Question
I'm sure there's a reason I have to add three zeros to every Unix timestamp in JavaScript in order to get the correct date. Can you tell me why? Is it as simple as milliseconds since the epoch vs. seconds?
Answer 1:
Because JavaScript uses milliseconds internally, while normal UNIX timestamps are usually in seconds.
Answer 2:
JavaScript uses the number of milliseconds since the epoch, whereas a Unix timestamp is the number of seconds since the epoch.
Hence the need to multiply a Unix timestamp by 1000 (converting it to milliseconds) before using it in JavaScript.
Answer 3:
Unix time is the number of seconds since the epoch (1 Jan 1970 UTC). In JavaScript, the Date
constructor expects the number of milliseconds since the epoch, hence the 1000-fold difference.
Source: https://stackoverflow.com/questions/4676195/why-do-i-need-to-multiply-unix-timestamps-by-1000-in-javascript