Spark SQL datediff in seconds

渐次进展 · 2021-02-20 08:56

I've got the following code:

table.select(datediff(table.col("Start Time"), table.col("End Time"))).show()

Date format is 2

1 Answer
  • 2021-02-20 09:44

    You can use the unix_timestamp() function to convert dates to seconds.

    import org.apache.spark.sql.functions._
    
    // For $-notation column references (Spark 2.0+)
    import spark.implicits._
    
    table.withColumn("date_diff", 
       (unix_timestamp($"Start Time") - unix_timestamp($"End Time"))
    ).show()
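
    If the timestamp columns are strings in a non-default format, unix_timestamp() also accepts an explicit pattern argument. A minimal sketch, assuming a "yyyy-MM-dd HH:mm:ss" pattern (the actual pattern in your data is an assumption here):

    // Assumed pattern; adjust it to the real string format of the columns
    val fmt = "yyyy-MM-dd HH:mm:ss"
    
    table.withColumn("date_diff", 
       (unix_timestamp($"Start Time", fmt) - unix_timestamp($"End Time", fmt))
    ).show()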
    

    Edit (as per comment):

    UDF to convert seconds to HH:mm:ss:

    sqlContext.udf.register("sec_to_time", (s: Long) => 
       ((s / 3600L) + ":" + ((s % 3600L) / 60L) + ":" + (s % 60L))
    )
    
    //Use the registered UDF by name via callUDF
    table.withColumn("date_diff", 
       callUDF("sec_to_time", unix_timestamp($"Start Time") - unix_timestamp($"End Time"))
    ).show()
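
    As an alternative to registering a SQL UDF, the same conversion can be done with the udf() helper from org.apache.spark.sql.functions and used directly in the DataFrame API. A minimal sketch; the secToTime name and the zero-padded formatting are my own additions, not part of the original answer:

    // Hypothetical helper: format a number of seconds as zero-padded HH:mm:ss
    val secToTime = udf((s: Long) =>
       f"${s / 3600L}%02d:${(s % 3600L) / 60L}%02d:${s % 60L}%02d"
    )
    
    table.withColumn("date_diff", 
       secToTime(unix_timestamp($"Start Time") - unix_timestamp($"End Time"))
    ).show()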
    