Can unix_timestamp() return unix time in milliseconds in Apache Spark?
Question: I'm trying to get the unix time from a timestamp field in milliseconds (13 digits), but unix_timestamp() currently returns it in seconds (10 digits).

    scala> var df = Seq("2017-01-18 11:00:00.000", "2017-01-18 11:00:00.123", "2017-01-18 11:00:00.882", "2017-01-18 11:00:02.432").toDF()
    df: org.apache.spark.sql.DataFrame = [value: string]

    scala> df = df.selectExpr("value timeString", "cast(value as timestamp) time")
    df: org.apache.spark.sql.DataFrame = [timeString: string, time: timestamp]

    scala> df = df
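For context on the seconds-vs-milliseconds distinction the question hinges on: unix_timestamp() yields whole epoch seconds, so the fractional part of the timestamp is lost. The sketch below illustrates this in plain Scala with java.time (not Spark, so it runs standalone) for one of the sample values; the variable names are illustrative, not from the post:

```scala
import java.time.{LocalDateTime, ZoneOffset}
import java.time.format.DateTimeFormatter

// Parse one of the sample strings from the question.
val fmt = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss.SSS")
val t   = LocalDateTime.parse("2017-01-18 11:00:00.123", fmt)

// Epoch seconds, as unix_timestamp() would return: the .123 fraction is dropped.
val seconds = t.toEpochSecond(ZoneOffset.UTC)

// Epoch milliseconds: the 13-digit value the question is after; keeps the .123.
val millis = t.toInstant(ZoneOffset.UTC).toEpochMilli

println(seconds) // 10 digits
println(millis)  // 13 digits
```

Assuming this is the goal, a common workaround in Spark itself (not unix_timestamp) is to cast the timestamp column to double, which preserves fractional seconds, and multiply by 1000, e.g. `selectExpr("cast(cast(time as double) * 1000 as long)")`.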