Why do I need to multiply unix timestamps by 1000 in JavaScript?

混江龙づ霸主 submitted on 2019-12-12 07:24:43

Question


I'm sure there's a reason I have to add three zeros to every Unix timestamp in JavaScript in order to get the correct date. Can you tell me why? Is it as simple as milliseconds since the epoch vs. seconds?


Answer 1:


Because JavaScript measures time in milliseconds internally, while standard Unix timestamps are in seconds.




Answer 2:


JavaScript uses the number of milliseconds since the epoch, while a Unix timestamp is the number of seconds since the epoch.

Hence the need to convert a Unix timestamp into milliseconds before using it in JavaScript.
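
A minimal sketch of that conversion; the timestamp value here is just an illustrative example:

```javascript
// A Unix timestamp in seconds (hypothetical example value).
const unixSeconds = 1292956632;

// JavaScript's Date constructor expects milliseconds since the epoch,
// so multiply by 1000 (the "three zeros" from the question).
const date = new Date(unixSeconds * 1000);

console.log(date.toISOString()); // "2010-12-21T18:37:12.000Z"
```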




Answer 3:


Unix time is the number of seconds since the epoch (1 January 1970, 00:00 UTC). In JavaScript, the Date object expects the number of milliseconds since the epoch, hence the 1000-fold difference.
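
Going the other way, a JavaScript time can be turned into a Unix timestamp by dividing by 1000 and truncating; a minimal sketch:

```javascript
// Date.now() returns the current time in milliseconds since the epoch.
const millis = Date.now();

// Divide by 1000 and truncate to get a standard Unix timestamp in seconds.
const unixSeconds = Math.floor(millis / 1000);

console.log(unixSeconds); // e.g. 1576135483 (value depends on when this runs)
```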



Source: https://stackoverflow.com/questions/4676195/why-do-i-need-to-multiply-unix-timestamps-by-1000-in-javascript
