Question
I am trying to get the local time in Spark/Scala, but it is returning UTC.
I am using java.time.LocalDateTime to get the current timestamp, but it returns UTC:
java.sql.Timestamp.valueOf(DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss.SSSSSS").format(LocalDateTime.now))
LocalDateTime returns the local time in the spark-shell, but in my application code it returns UTC.
val time: LocalDateTime = LocalDateTime.now
How do I get the current local time? The current output is UTC; I need to change the zone to get local time.
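The usual cause is that the JVM's default time zone (where the job actually runs) is UTC, while the spark-shell machine's default is local. One way to sidestep the JVM default is to pass an explicit `ZoneId` to `LocalDateTime.now`. A minimal sketch, assuming `Asia/Kolkata` as an example zone (substitute your own); note the pattern uses `yyyy` (year), since `YYYY` is the week-based year and gives wrong results around year boundaries:

```scala
import java.time.{Instant, LocalDateTime, ZoneId, ZoneOffset}
import java.time.format.DateTimeFormatter

// Pass a ZoneId explicitly instead of relying on the JVM default zone.
val zone = ZoneId.of("Asia/Kolkata") // example zone, an assumption

// yyyy, not YYYY: YYYY is the week-based year.
val fmt = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss.SSSSSS")
val ts  = java.sql.Timestamp.valueOf(fmt.format(LocalDateTime.now(zone)))
println(ts)

// The same instant renders differently depending on the zone:
val instant = Instant.parse("2020-01-01T12:00:00Z")
println(LocalDateTime.ofInstant(instant, ZoneOffset.UTC)) // 2020-01-01T12:00
println(LocalDateTime.ofInstant(instant, zone))           // 2020-01-01T17:30
```

Alternatively, setting `-Duser.timezone=...` on the JVM (driver and executors) changes the default zone everywhere, but passing a `ZoneId` keeps the code independent of deployment settings.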
Answer 1:
Use current_timestamp() from org.apache.spark.sql.functions; this gives the local time.
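A minimal sketch of this in a DataFrame, assuming a local SparkSession (the zone name is an example). `current_timestamp()` values are stored internally as UTC; how they are *displayed* is controlled by `spark.sql.session.timeZone`, which defaults to the JVM zone, so setting it explicitly makes the output deterministic:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.current_timestamp

val spark = SparkSession.builder()
  .master("local[*]")
  .appName("current-timestamp-demo")
  // Rendered time zone for timestamps; example zone, an assumption.
  .config("spark.sql.session.timeZone", "Asia/Kolkata")
  .getOrCreate()

// Add the current timestamp as a column; it displays in the session zone.
spark.range(1)
  .withColumn("now", current_timestamp())
  .show(false)

spark.stop()
```

If the cluster's JVMs run in UTC, setting `spark.sql.session.timeZone` (or `-Duser.timezone` on driver and executors) is what changes the zone you see.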
Source: https://stackoverflow.com/questions/58143743/how-to-get-the-current-local-time-or-system-time-in-spark-scala-dataframe