Spark SQL converting string to timestamp

梦谈多话 2020-12-31 02:36

I'm new to Spark SQL and am trying to convert a string to a timestamp in a Spark data frame. I have a string that looks like '2017-08-01T02:26:59.000Z' in a column.

2 Answers
  • 2020-12-31 03:31

    You could use the unix_timestamp function to convert the UTC-formatted date string to a timestamp:

    import org.apache.spark.sql.functions.unix_timestamp
    import org.apache.spark.sql.types.TimestampType
    import spark.implicits._

    val df2 = Seq(("a3fac", "2017-08-01T02:26:59.000Z")).toDF("id", "eventTime")

    df2.withColumn("eventTime1", unix_timestamp($"eventTime", "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'").cast(TimestampType)).show(false)
    

    Output:

    +-----+------------------------+---------------------+
    |id   |eventTime               |eventTime1           |
    +-----+------------------------+---------------------+
    |a3fac|2017-08-01T02:26:59.000Z|2017-08-01 02:26:59.0|
    +-----+------------------------+---------------------+
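    The format pattern used above can be sanity-checked outside Spark with the JDK's java.time API, which uses the same pattern syntax. This is a minimal sketch; note that the quoted 'Z' is matched as a literal character rather than interpreted as a zone offset:

    ```java
    import java.time.LocalDateTime;
    import java.time.format.DateTimeFormatter;

    public class ParseCheck {
        public static void main(String[] args) {
            // Same pattern as in the unix_timestamp call above.
            DateTimeFormatter fmt =
                DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'");
            LocalDateTime ts = LocalDateTime.parse("2017-08-01T02:26:59.000Z", fmt);
            System.out.println(ts); // prints 2017-08-01T02:26:59
        }
    }
    ```

    If the parse throws a DateTimeParseException here, the same pattern will fail (produce null) in unix_timestamp as well, so this is a quick way to debug the format string.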
    

    Hope this helps!

  • 2020-12-31 03:31

    Solution in Java

    There are some Spark SQL functions that let you work with the date format.

    Conversion example : 20181224091530 -> 2018-12-24 09:15:30

    Solution (Spark SQL statement) :

    SELECT
     ...
     to_timestamp(cast(DECIMAL_DATE as string),'yyyyMMddHHmmss') as `TIME STAMP DATE`,
     ...
    FROM some_table
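
    The cast-and-parse steps in the SQL above can be reproduced with plain java.time to verify the pattern. A minimal sketch of the 20181224091530 example:

    ```java
    import java.time.LocalDateTime;
    import java.time.format.DateTimeFormatter;

    public class DecimalDateCheck {
        public static void main(String[] args) {
            // The cast(DECIMAL_DATE as string) step, done in plain Java:
            long decimalDate = 20181224091530L;
            DateTimeFormatter fmt = DateTimeFormatter.ofPattern("yyyyMMddHHmmss");
            // The to_timestamp step:
            LocalDateTime ts = LocalDateTime.parse(String.valueOf(decimalDate), fmt);
            System.out.println(ts); // prints 2018-12-24T09:15:30
        }
    }
    ```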
    

    You can run SQL statements through an instance of org.apache.spark.sql.SparkSession. For example, to execute an SQL statement, Spark provides the following:

    ...
    // You have to create an instance of SparkSession
    sparkSession.sql(sqlStatement); 
    ...
    

    Notes:

    • You have to convert the decimal to a string first, and then you can parse it to a timestamp
    • You can play with the format string to get whatever output format you want