Question:
When I use DATE_FORMAT on the Dec 31st 2018 date, the year changes to 2019. Can someone tell me whether this is a bug or whether I am missing something?
import org.apache.spark.sql.functions._
spark.sql("select CAST(1546268400 AS TIMESTAMP)").show(false)
Output: 2018-12-31 15:00:00.0
spark.sql("select DATE_FORMAT(CAST(1546268400 AS TIMESTAMP), 'MM/dd/YYYY HH:mm')").show(false)
Output: 12/31/2019 15:00
Answer 1:
So this doesn't exactly answer your question, but the choice of YYYY vs. yyyy seems crucial here. I'm still investigating, but this might help you figure it out as well.
Update: https://github.com/davedelong/calendar_fallacies/issues/26
The distinction between YYYY and yyyy is ISO week year vs. calendar year.
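The behavior can be reproduced outside Spark: Spark 2.x delegates DATE_FORMAT patterns to Java's SimpleDateFormat, where 'y' is the calendar year and 'Y' is the week year. Dec 31st 2018 is a Monday that falls in week 1 of 2019, so 'YYYY' prints 2019. A minimal sketch (the class name and helper are my own; the timestamp and patterns come from the question, rendered in UTC to match the question's output):

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;
import java.util.TimeZone;

public class WeekYearDemo {
    // Format the given epoch seconds with the given pattern, in UTC.
    static String format(long epochSeconds, String pattern) {
        SimpleDateFormat fmt = new SimpleDateFormat(pattern, Locale.US);
        fmt.setTimeZone(TimeZone.getTimeZone("UTC"));
        return fmt.format(new Date(epochSeconds * 1000L));
    }

    public static void main(String[] args) {
        long ts = 1546268400L; // 2018-12-31 15:00:00 UTC

        // 'yyyy' = calendar year: stays 2018
        System.out.println(format(ts, "MM/dd/yyyy HH:mm")); // 12/31/2018 15:00

        // 'YYYY' = week year: Dec 31 2018 belongs to week 1 of 2019
        System.out.println(format(ts, "MM/dd/YYYY HH:mm")); // 12/31/2019 15:00
    }
}
```

So the fix for the original query is simply to use the lowercase pattern, DATE_FORMAT(..., 'MM/dd/yyyy HH:mm'); the uppercase 'YYYY' only makes sense together with week-of-year ('w') patterns.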
Source: https://stackoverflow.com/questions/54496878/date-format-conversion-is-adding-1-year-to-the-border-dates