I have a dataframe in PySpark with a date column called "report_date".
I want to create a new column called "report_date_10" that is 10 days after the original date.
It seems you are using pandas syntax for adding a column; in Spark you need to use withColumn to add a new column. For adding the days, there's the built-in date_add function:
import pyspark.sql.functions as F

df_dc = spark.createDataFrame([['2018-05-30']], ['report_date'])
# date_add shifts each date in the column forward by the given number of days
df_dc.withColumn('report_date_10', F.date_add(df_dc['report_date'], 10)).show()
+-----------+--------------+
|report_date|report_date_10|
+-----------+--------------+
| 2018-05-30| 2018-06-09|
+-----------+--------------+
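
If the offset needs to vary per row instead of being a fixed literal, here is a minimal sketch (assuming a hypothetical n_days column, which is not in the question; expr() keeps this working on Spark versions where date_add does not accept a Column for the day count directly):

import pyspark.sql.functions as F

df = spark.createDataFrame([('2018-05-30', 10), ('2018-05-30', 31)],
                           ['report_date', 'n_days'])
# n_days is a hypothetical per-row offset column used for illustration
df.withColumn('report_date_n', F.expr('date_add(report_date, n_days)')).show()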