Question
I created a DataFrame using sqlContext, but I have a problem with the datetime column: it is identified as a string.
df2 = sqlContext.createDataFrame(i[1])
df2.show()
df2.printSchema()
Result:
2016-07-05T17:42:55.238544+0900
2016-07-05T17:17:38.842567+0900
2016-06-16T19:54:09.546626+0900
2016-07-05T17:27:29.227750+0900
2016-07-05T18:44:12.319332+0900
 |-- _1: string (nullable = true)
Since the column's type is string, I tried to cast it to a datetime format as follows:
df3 = df2.withColumn('_1', df2['_1'].cast(datetime()))
Here I got an error: TypeError: Required argument 'year' (pos 1) not found
What should I do to solve this problem?
Answer 1:
The error occurs because cast() expects a Spark SQL type (an instance of pyspark.sql.types.DataType), not Python's built-in datetime class, which requires a year argument to construct. Try this:
from pyspark.sql.types import DateType
ndf = df2.withColumn('_1', df2['_1'].cast(DateType()))
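A side note, not part of the original answer: casting to DateType keeps only the calendar date and drops the time of day; if you need the full timestamp, casting to pyspark.sql.types.TimestampType() may be closer to what you want. The sample values are ISO-8601 timestamps with a UTC offset, which you can confirm in plain Python without Spark — a minimal sketch:

```python
from datetime import datetime, timedelta

# One of the sample values from the question
s = "2016-07-05T17:42:55.238544+0900"

# %f parses the microseconds, %z the +0900 UTC offset
parsed = datetime.strptime(s, "%Y-%m-%dT%H:%M:%S.%f%z")

print(parsed.date())    # the part DateType would keep: 2016-07-05
print(parsed.timetz())  # the part DateType would drop
assert parsed.utcoffset() == timedelta(hours=9)
```

If the strings parse this way, Spark's TimestampType cast should also handle them; a DateType cast will silently truncate the time portion.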
Source: https://stackoverflow.com/questions/39198062/how-to-convert-datetime-from-string-format-into-datetime-format-in-pyspark