I am trying to split my date column, which is currently a string type, into three columns: year, month, and day. I use (PySpark):
split_date = pyspark.sql.functions.split
You just need a little bit of extra coding to recognize the type of date format. For example, let's say your data is in the format below:
data = [("2008-05-01",1),("2018-01-01",2),("03/14/2017",3),("01/01/2018",4)]
df = spark.createDataFrame(data,schema=['date','key'])
df.show()
+----------+---+
| date|key|
+----------+---+
|2008-05-01| 1|
|2018-01-01| 2|
|03/14/2017| 3|
|01/01/2018| 4|
+----------+---+
from pyspark.sql.functions import udf, col
from pyspark.sql.types import ArrayType, StringType

# UDF that recognises the date pattern and returns [year, month, day]
def splitUDF(row):
    if "/" in row:
        mm, dd, yyyy = row.split("/")   # MM/dd/yyyy
    elif "-" in row:
        yyyy, mm, dd = row.split("-")   # yyyy-MM-dd
    else:
        return [None, None, None]       # unrecognised format
    return [yyyy, mm, dd]

datSplitterUDF = udf(splitUDF, ArrayType(StringType()))
df \
    .select(datSplitterUDF(df.date).alias("dt")) \
    .withColumn('year', col('dt').getItem(0).cast('int')) \
    .withColumn('month', col('dt').getItem(1).cast('int')) \
    .withColumn('day', col('dt').getItem(2).cast('int')) \
    .show()
Output:
+--------------+----+-----+---+
| dt|year|month|day|
+--------------+----+-----+---+
|[2008, 05, 01]|2008| 5| 1|
|[2018, 01, 01]|2018| 1| 1|
|[2017, 03, 14]|2017| 3| 14|
|[2018, 01, 01]|2018| 1| 1|
+--------------+----+-----+---+