In PySpark, is there a way to convert a DataFrame column of timestamp datatype to a string in 'YYYY-MM-DD' format?
You can use the date_format function as below:
from pyspark.sql.functions import col, date_format
df.withColumn("dateColumn", date_format(col("vacationdate"), "yyyy-MM-dd"))
Hope this helps!
If you have a column with the schema

root
|-- date: timestamp (nullable = true)

then you can use the from_unixtime function to convert the timestamp to a string, after first converting it to bigint with the unix_timestamp function:
from pyspark.sql import functions as f
df.withColumn("date", f.from_unixtime(f.unix_timestamp(df.date), "yyyy-MM-dd"))
and you should have
root
|-- date: string (nullable = true)
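A minimal end-to-end sketch of that conversion (the one-row sample DataFrame below is an assumption for illustration):

from pyspark.sql import SparkSession
from pyspark.sql import functions as f
import datetime

spark = SparkSession.builder.getOrCreate()
df = spark.createDataFrame([(datetime.datetime(2018, 3, 9, 8, 0),)], ["date"])
df.printSchema()  # date: timestamp

# unix_timestamp gives seconds as bigint; from_unixtime formats them as a string
df = df.withColumn("date", f.from_unixtime(f.unix_timestamp(df.date), "yyyy-MM-dd"))
df.printSchema()  # date: string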
You can also pass the column name to date_format as a plain string:
from pyspark.sql.functions import date_format
df.withColumn("DateOnly", date_format('DateTime', "yyyy-MM-dd")).show()