Question
I'm trying to find a better way to convert a DateTime to a Unix timestamp in C#.
I found out that there is a DateTimeOffset.ToUnixTimeMilliseconds method:
public long ToUnixTimeMilliseconds()
{
    return this.UtcDateTime.Ticks / 10000L - 62135596800000L;
}
What does this method do? What are the constants used for?
UPD: I guess 10000L converts ticks to milliseconds. But what about 62135596800000L?
Answer 1:
To explain this method:
public long ToUnixTimeMilliseconds()
{
    return this.UtcDateTime.Ticks / 10000L - 62135596800000L;
}
DateTime.Ticks units are 100-nanosecond intervals.
Dividing this by 10_000 yields milliseconds, which explains the division by 10000L.
This is because one nanosecond is one billionth of a second, or one millionth of a millisecond.
To convert a nanosecond to a millisecond you would therefore divide by 1_000_000.
However, the ticks are 100-nanosecond units, so instead of dividing by 1_000_000 you divide by 1_000_000 / 100 = 10_000. That's why dividing the ticks by 10_000 gives milliseconds.
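For example, a quick sketch (TimeSpan.TicksPerMillisecond is just the named version of that 10_000 literal):

var utcNow = DateTime.UtcNow;
long millisecondsA = utcNow.Ticks / 10_000L;                        // the literal used in the method
long millisecondsB = utcNow.Ticks / TimeSpan.TicksPerMillisecond;   // same value via the named constant
Console.WriteLine(millisecondsA == millisecondsB);                  // True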
The Unix epoch (which corresponds to a Unix time of zero) is midnight on 1st January 1970.
The DateTime epoch (which corresponds to a DateTime.Ticks value of zero) is 1st January 0001.
The number of milliseconds between 1st January 0001 and 1st January 1970 is 62135596800000. This explains the subtraction of 62135596800000.
And there you have it!
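Putting the two constants together, here is a minimal sketch that reproduces the built-in result by hand (any UTC DateTimeOffset will do; the date below is arbitrary):

var dto = new DateTimeOffset(2019, 10, 15, 12, 0, 0, TimeSpan.Zero);
long manual  = dto.UtcDateTime.Ticks / 10_000L - 62_135_596_800_000L;
long builtIn = dto.ToUnixTimeMilliseconds();
Console.WriteLine(manual == builtIn);   // True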
Note: You can compute an approximate value for the number of milliseconds as follows:
Approximate number of days per year = 365.24219
Number of years between 0001 and 1970 = 1969
Thus, total approx milliseconds = 1969 * 365.24219 * 24 * 60 * 60 * 1000
= 62135585750000
The exact figure is much harder to calculate, but it comes out to 62135596800000 as used in the formula above.
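In code, the rough estimate is just the arithmetic above:

// Rough estimate only -- good enough to sanity-check the order of magnitude
double approxMilliseconds = 1969 * 365.24219 * 24 * 60 * 60 * 1000;
Console.WriteLine((long)approxMilliseconds);   // about 62,135,585,750,000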
In fact, from inspection of the source code we can find the following:
public long ToUnixTimeSeconds() {
    // Truncate sub-second precision before offsetting by the Unix Epoch to avoid
    // the last digit being off by one for dates that result in negative Unix times.
    //
    // For example, consider the DateTimeOffset 12/31/1969 12:59:59.001 +0
    //   ticks            = 621355967990010000
    //   ticksFromEpoch   = ticks - UnixEpochTicks = -9990000
    //   secondsFromEpoch = ticksFromEpoch / TimeSpan.TicksPerSecond = 0
    //
    // Notice that secondsFromEpoch is rounded *up* by the truncation induced by integer division,
    // whereas we actually always want to round *down* when converting to Unix time. This happens
    // automatically for positive Unix time values. Now the example becomes:
    //   seconds          = ticks / TimeSpan.TicksPerSecond = 62135596799
    //   secondsFromEpoch = seconds - UnixEpochSeconds      = -1
    //
    // In other words, we want to consistently round toward the time 1/1/0001 00:00:00,
    // rather than toward the Unix Epoch (1/1/1970 00:00:00).
    long seconds = UtcDateTime.Ticks / TimeSpan.TicksPerSecond;
    return seconds - UnixEpochSeconds;
}
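To see that rounding difference in action, here is a small sketch using the same tick value as the comment's example (unixEpochTicks and unixEpochSeconds are local stand-ins for the framework's internal constants):

// 23:59:59.001 on 12/31/1969 UTC, i.e. 999 ms before the Unix epoch
var dto = new DateTimeOffset(1969, 12, 31, 23, 59, 59, 1, TimeSpan.Zero);            // ticks = 621355967990010000

long unixEpochTicks   = new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc).Ticks;   // 621355968000000000
long unixEpochSeconds = unixEpochTicks / TimeSpan.TicksPerSecond;                    // 62135596800

// Offset first, then truncate: integer division rounds toward zero, i.e. toward the Unix epoch
long naive = (dto.UtcTicks - unixEpochTicks) / TimeSpan.TicksPerSecond;              // 0

// Truncate first, then offset: rounds toward 1/1/0001, as the framework does
long truncatedFirst = dto.UtcTicks / TimeSpan.TicksPerSecond - unixEpochSeconds;     // -1

Console.WriteLine(dto.ToUnixTimeSeconds());   // -1, matching truncatedFirst

The source also defines the day-count constants from which the epoch offset is derived: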
// Number of days in a non-leap year
private const int DaysPerYear = 365;
// Number of days in 4 years
private const int DaysPer4Years = DaysPerYear * 4 + 1; // 1461
// Number of days in 100 years
private const int DaysPer100Years = DaysPer4Years * 25 - 1; // 36524
// Number of days in 400 years
private const int DaysPer400Years = DaysPer100Years * 4 + 1; // 146097
// Number of days from 1/1/0001 to 12/31/1600
private const int DaysTo1601 = DaysPer400Years * 4; // 584388
// Number of days from 1/1/0001 to 12/30/1899
private const int DaysTo1899 = DaysPer400Years * 4 + DaysPer100Years * 3 - 367;
// Number of days from 1/1/0001 to 12/31/1969
internal const int DaysTo1970 = DaysPer400Years * 4 + DaysPer100Years * 3 + DaysPer4Years * 17 + DaysPerYear; // 719,162
Which we can now use to calculate the number of milliseconds to 1970:
719162 (DaysTo1970) * 24 (hours) * 60 (minutes) * 60 (seconds) * 1000 (milliseconds)
= 62135596800000, exactly the constant subtracted in ToUnixTimeMilliseconds above.
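A quick way to verify this in code (a sketch; the day count is written out as a literal because the framework's DaysTo1970 constant is internal):

long daysTo1970        = 719_162;
long epochMilliseconds = daysTo1970 * 24 * 60 * 60 * 1000;           // 62135596800000
long fromTicks         = new DateTime(1970, 1, 1).Ticks / 10_000L;   // 62135596800000
Console.WriteLine(epochMilliseconds == fromTicks);                   // True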
Source: https://stackoverflow.com/questions/58391011/explain-tounixtimemilliseconds