Question
This is bizarre, but I'm sure there's a perfectly good explanation.
My team and I recently discovered that when we use Java's LocalDate and send it back to the frontend in its default string format "YYYY-MM-DD", JavaScript creates the Date assuming the string is UTC; since we're in the Eastern time zone, the result ends up 5 hours behind.
Annoying, but we get it.
However, when we send it back with a time component, as in "YYYY-MM-DDThh:mm:ss", it is parsed as a local date. Ok, weird... but it gets weirder.
Now the bizarre part: if we send the string without the zero padding on the day, as in "YYYY-MM-D", it is also parsed as a local date. Why?
Here's an example:
new Date("2017-12-09")
// output: Fri Dec 08 2017 19:00:00 GMT-0500 (Eastern Standard Time)
new Date("2017-12-9")
// output: Sat Dec 09 2017 00:00:00 GMT-0500 (Eastern Standard Time)
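For comparison, here is a sketch of the date-time case described above (the output shown assumes an Eastern Time machine and a modern engine; older engines treated this form differently):
new Date("2017-12-09T00:00:00")
// output: Sat Dec 09 2017 00:00:00 GMT-0500 (Eastern Standard Time)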
Why is this???
Answer 1:
The answer depends on your browser's implementation.
See https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Date
Note: parsing of date strings with the Date constructor (and Date.parse, they are equivalent) is strongly discouraged due to browser differences and inconsistencies. Support for RFC 2822 format strings is by convention only. Support for ISO 8601 formats differs in that date-only strings (e.g. "1970-01-01") are treated as UTC, not local.
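Because of this inconsistency, a common workaround is to avoid the Date constructor's string parsing for date-only values and split the string yourself. A minimal sketch, assuming the backend keeps sending "YYYY-MM-DD" and you want local midnight (parseLocalDate is a hypothetical helper name):
// Parse "YYYY-MM-DD" manually so the result is local midnight,
// regardless of how the engine treats ISO date-only strings.
function parseLocalDate(isoDate) {
  const [year, month, day] = isoDate.split("-").map(Number);
  return new Date(year, month - 1, day); // month argument is 0-based
}
parseLocalDate("2017-12-09")
// output: Sat Dec 09 2017 00:00:00 GMT-0500 (Eastern Standard Time)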
Source: https://stackoverflow.com/questions/53542484/why-does-js-assume-string-date-without-time-is-utc-if-0-padded-and-local-if-not