I'm sure there's a reason I have to add three zeros to every Unix timestamp in JavaScript in order to get the correct date. Can you tell me why? Is it as simple as milliseconds vs. seconds?
Unix time is the number of seconds since the epoch (1 Jan 1970). In JavaScript, the Date object expects the number of milliseconds since the epoch, hence the 1000-fold difference.
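A minimal sketch of the conversion, using an arbitrary example timestamp:

```javascript
// A Unix timestamp, in seconds since the epoch (example value)
const unixSeconds = 1700000000;

// Date expects milliseconds since the epoch, so multiply by 1000
const date = new Date(unixSeconds * 1000);

console.log(date.toISOString()); // 2023-11-14T22:13:20.000Z
```

Passing the seconds value directly (without the `* 1000`) would instead produce a date only a few weeks after 1 January 1970.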
JavaScript uses the number of milliseconds since the epoch, while a Unix timestamp is the number of seconds since the epoch. Hence the need to convert a Unix timestamp into milliseconds before using it in JavaScript.
Because JavaScript uses milliseconds internally, while standard Unix timestamps are usually in seconds.