I am converting a string to a datetime field using the joda.time.DateTime library, but it throws an unsupported exception. Here is the main class code:
//create new var with
The Scala Spark schema does not support datetime explicitly. You can explore other options. They are:

1. Convert the datetime to millis and maintain it in Long format (see the sketch after this list).
2. Convert the datetime to Unix time (Java format): https://stackoverflow.com/a/44957376/9083843
3. Convert the datetime to a string; you can change it back to a Joda DateTime at any moment using DateTime.parse("stringdatetime").
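A minimal sketch of options 1 and 3, assuming you start from a Joda DateTime; the Event case class and the dt value are illustrative, not from the original code:

import org.joda.time.DateTime

// Spark can encode Long and String fields directly in a case-class schema.
case class Event(tsMillis: Long, tsString: String)

val dt = new DateTime(2016, 1, 11, 0, 1, 2)

// Option 1: keep the instant as epoch millis.
val asMillis: Long = dt.getMillis

// Option 3: keep an ISO-8601 string and parse it back when needed.
val asString: String = dt.toString
val roundTripped: DateTime = DateTime.parse(asString)

val event = Event(asMillis, asString)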
If you still want to maintain a Joda DateTime in your Scala schema, then you can convert your DataFrame to a sequence:

dataframe.rdd.map(r => DateTime.parse(r(0).toString)).collect().toSeq
Thanks, zero323, for the solution. I used java.sql.Timestamp, and here is the code I modified:
import org.joda.time.format.DateTimeFormat

// Parse the string with Joda, then wrap the epoch millis in a java.sql.Timestamp.
val dateYMD: java.sql.Timestamp = new java.sql.Timestamp(DateTimeFormat.forPattern("yyyy-MM-dd HH:mm:ss").parseDateTime(p(8)).getMillis)
testData(dateYMD)}.toDF().show() // fragment: the closing brace ends the surrounding map
and changed my class to:

case class testData(GamingDate: java.sql.Timestamp)
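For context, a minimal end-to-end sketch of that approach, assuming the input is a comma-separated text file whose ninth column (index 8, matching p(8) above) holds the timestamp; the file path and the implicits import are illustrative:

import org.joda.time.format.DateTimeFormat
import sqlContext.implicits._ // assumes a SQLContext named sqlContext is in scope

val fmt = DateTimeFormat.forPattern("yyyy-MM-dd HH:mm:ss")

sc.textFile("input.csv") // illustrative path
  .map(_.split(","))
  .map(p => testData(new java.sql.Timestamp(fmt.parseDateTime(p(8)).getMillis)))
  .toDF()
  .show()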
As you can read in the official documentation, dates in Spark SQL are represented using java.sql.Timestamp. If you want to use Joda time, you have to convert the output to the correct type.
Spark SQL can easily handle standard date formats using type casting:

sc.parallelize(Seq(Tuple1("2016-01-11 00:01:02")))
  .toDF("dt")
  .select($"dt".cast("timestamp"))