Question
I'm getting dates from a database in SQL Server.
When I read the DateTime from the DB and compare the milliseconds in C# and SQL Server, I see that they are not the same.
Why is that?
Answer 1:
SQL Server's datetime data type is not accurate to the millisecond.
The official documentation provides a list of properties of the datetime data type.
In that list, you will find the following row:
Accuracy: Rounded to increments of .000, .003, or .007 seconds
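As a rough illustration (the values below are arbitrary), here is a small T-SQL sketch of how those increments play out when a literal is stored in a datetime variable:

    -- Each literal is declared as datetime, so the stored value snaps
    -- to the nearest .000/.003/.007 increment.
    DECLARE @a datetime = '2019-04-01 12:34:56.124';
    DECLARE @b datetime = '2019-04-01 12:34:56.125';
    DECLARE @c datetime = '2019-04-01 12:34:56.999';

    SELECT @a AS rounded_down,  -- 2019-04-01 12:34:56.123
           @b AS rounded_up,    -- 2019-04-01 12:34:56.127
           @c AS rolled_over;   -- 2019-04-01 12:34:57.000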
You will also find, on that same page, the following quote:
Note
Use the time, date, datetime2 and datetimeoffset data types for new work. These types align with the SQL Standard. They are more portable. time, datetime2 and datetimeoffset provide more seconds precision. datetimeoffset provides time zone support for globally deployed applications.
If you worked with DateTime2 instead of DateTime, you would get 100-nanosecond accuracy, among other benefits.
In fact, except in cases where you need to maintain backwards compatibility, you should not work with DateTime at all, only with the newer data types.
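For example, a quick sketch (again with arbitrary values) of the same fractional second stored as datetime versus datetime2:

    -- datetime rounds .456 to .457; datetime2 (default precision 7)
    -- keeps the value exactly, down to 100 nanoseconds.
    DECLARE @dt  datetime  = '2019-04-01 12:34:56.456';
    DECLARE @dt2 datetime2 = '2019-04-01 12:34:56.4560000';

    SELECT @dt  AS datetime_value,   -- 2019-04-01 12:34:56.457
           @dt2 AS datetime2_value;  -- 2019-04-01 12:34:56.4560000

That also explains the mismatch in the question: the milliseconds you see in C# are the ones datetime has already rounded.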
Source: https://stackoverflow.com/questions/55449221/milliseconds-from-datetime-in-sql-server-and-c-sharp-are-not-the-same