I'm new to Spark SQL and am trying to convert a string to a timestamp in a Spark data frame. I have a string that looks like '2017-08-01T02:26:59.000Z'.
You could use the unix_timestamp function to convert the UTC-formatted date string to a timestamp:
import org.apache.spark.sql.functions.unix_timestamp
import org.apache.spark.sql.types.TimestampType
import spark.implicits._ // for toDF and the $ column syntax

val df2 = Seq(("a3fac", "2017-08-01T02:26:59.000Z")).toDF("id", "eventTime")
df2.withColumn("eventTime1", unix_timestamp($"eventTime", "yyyy-MM-dd'T'HH:mm:ss.SSS'Z'").cast(TimestampType))
Output:
+-----+---------------------+
|id   |eventTime1           |
+-----+---------------------+
|a3fac|2017-08-01 02:26:59.0|
+-----+---------------------+
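If you want to sanity-check the format string outside of Spark, the same pattern can be exercised with plain java.time (a standalone sketch, not part of the Spark job; the class name is made up for illustration):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class PatternCheck {
    public static void main(String[] args) {
        // Same pattern string as the unix_timestamp call above. The quoted
        // 'T' and 'Z' are matched as literal characters, not as a zone offset,
        // so the input parses to a LocalDateTime.
        DateTimeFormatter fmt =
            DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'");
        LocalDateTime ts = LocalDateTime.parse("2017-08-01T02:26:59.000Z", fmt);
        System.out.println(ts); // 2017-08-01T02:26:59
    }
}
```

Note that because the 'Z' is treated as a literal, no time-zone conversion happens here; Spark's unix_timestamp interprets the parsed value in the session time zone.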
Hope this helps!
Solution in Java
There are some Spark SQL functions that let you play with the date format.
Conversion example: 20181224091530 -> 2018-12-24 09:15:30
Solution (Spark SQL statement):
SELECT
...
to_timestamp(cast(DECIMAL_DATE as string),'yyyyMMddHHmmss') as `TIME STAMP DATE`,
...
FROM some_table
You can execute SQL statements through an instance of org.apache.spark.sql.SparkSession. For example, to run an SQL statement, Spark provides the following:
...
// You have to create an instance of SparkSession first
Dataset<Row> result = sparkSession.sql(sqlStatement);
...
Notes: