Question
Environment: Spark 1.6; Scala
Simple question, but I did not get an accurate answer. I have a DataFrame DF:
id | cr_date
---|--------------------
 1 | 2017-03-17 11:12:00
 2 | 2017-03-17 15:10:00
I need to subtract 5 minutes from cr_date. I tried:
val DF2 = DF.select($"cr_Date".cast("timestamp").minusMinutes(5))
// Did not work
Any suggestions? Thanks.
Answer 1:
df.select(from_unixtime(unix_timestamp(col("cr_dt")).minus(5 * 60), "YYYY-MM-dd HH:mm:ss"))
There is no minusMinutes method available in Spark. The code above should return the expected result.
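If a proper timestamp column is wanted rather than a formatted string, the epoch seconds can also be cast straight back to a timestamp instead of going through from_unixtime. A minimal sketch, assuming the DataFrame is called df with the cr_date column from the question (df2 and cr_date_minus5 are made-up names):

import org.apache.spark.sql.functions.{col, unix_timestamp}

// unix_timestamp converts cr_date to seconds since the epoch; subtracting
// 300 seconds and casting back yields a timestamp column shifted by 5 minutes.
val df2 = df.withColumn(
  "cr_date_minus5",
  (unix_timestamp(col("cr_date")) - 5 * 60).cast("timestamp")
)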
Answer 2:
In case anyone runs into the same issue: I found that the above method does not roll the year back when the subtraction crosses a year boundary. For example, I had a data frame with the timestamp "2015-01-01 00:00:00"; when applying:
df.select(from_unixtime(unix_timestamp(col("cr_dt")).minus(5 * 60), "YYYY-MM-dd HH:mm:ss"))
I got the result "2015-12-31 23:55:00" however my expected result was "2014-12-31 23:55:00". It seems that this is due to having "YYYY" as opposed to "yyyy". Making this change:
df.select(from_unixtime(unix_timestamp(col("cr_dt")).minus(5 * 60), "yyyy-MM-dd HH:mm:ss"))
Gives the result I was looking for.
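To see why, the same pattern letters can be tried directly with java.text.SimpleDateFormat, which uses the same pattern syntax as the format string above. A small stand-alone sketch illustrating the difference on the boundary date from this answer:

import java.text.SimpleDateFormat

// 2014-12-31 falls in the week that contains 2015-01-01, so the
// week-based-year pattern "YYYY" reports 2015 while "yyyy" reports 2014.
val ts = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").parse("2014-12-31 23:55:00")
println(new SimpleDateFormat("YYYY-MM-dd HH:mm:ss").format(ts)) // 2015-12-31 23:55:00
println(new SimpleDateFormat("yyyy-MM-dd HH:mm:ss").format(ts)) // 2014-12-31 23:55:00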
Source: https://stackoverflow.com/questions/43039217/dataframe-minus-minutes-from-timestamp-column