UnsupportedOperationException: Error converting string to DateTime using Joda-Time

有刺的猬 2021-01-21 19:16

I am converting a string to a datetime field using the org.joda.time.DateTime library, but it throws an UnsupportedOperationException. Here is the main class code:

//create new var with
3 answers
  • 2021-01-21 19:33

    Spark SQL schemas do not support Joda DateTime explicitly. You can explore other options:

    1. Convert the datetime to millis and keep it as a Long.

    2. Convert the datetime to Unix time (Java format): https://stackoverflow.com/a/44957376/9083843

    3. Convert the datetime to a String; you can change it back to a Joda DateTime at any moment using DateTime.parse("stringdatetime").

    4. If you still want to keep Joda DateTime in your Scala schema, you can convert your DataFrame to a sequence:

      dataframe.rdd.map(r => DateTime.parse(r(0).toString())).collect().toSeq
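Option 1 above can be sketched in plain Scala. This uses `java.time` from the JDK so the snippet has no external dependency; with Joda you would call `DateTimeFormat.forPattern(...).parseDateTime(s).getMillis` instead. The UTC offset is an assumption here — pick the zone your data actually uses:

```scala
import java.time.LocalDateTime
import java.time.ZoneOffset
import java.time.format.DateTimeFormatter

// Option 1 sketch: store the datetime as epoch millis (Long), which Spark
// schemas support natively. java.time stands in for Joda's
// DateTimeFormat.forPattern(...).parseDateTime(s).getMillis.
object DateTimeToMillis {
  private val fmt = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")

  def toMillis(s: String): Long =
    LocalDateTime.parse(s, fmt).toInstant(ZoneOffset.UTC).toEpochMilli

  def main(args: Array[String]): Unit =
    println(toMillis("2016-01-11 00:01:02"))
}
```

A Long column built this way fits directly into a DataFrame schema, and the original Joda value can be recovered with `new DateTime(millis)`.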

  • 2021-01-21 19:49

    Thanks zero323 for the solution. I used java.sql.Timestamp; here is the code I modified:

    val dateYMD: java.sql.Timestamp = new java.sql.Timestamp(
      DateTimeFormat.forPattern("yyyy-MM-dd HH:mm:ss").parseDateTime(p(8)).getMillis)
    testData(dateYMD)}.toDF().show()
    

    and changed my class to

    case class testData(GamingDate: java.sql.Timestamp) { }
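A self-contained sketch of the same approach, using only the JDK (SimpleDateFormat stands in for Joda's DateTimeFormat, which needs the joda-time dependency); the testData/GamingDate names and the yyyy-MM-dd HH:mm:ss pattern follow the answer above:

```scala
import java.sql.Timestamp
import java.text.SimpleDateFormat

// Same idea without the Joda dependency: parse the string with
// SimpleDateFormat and wrap the resulting millis in java.sql.Timestamp,
// the type Spark SQL maps to TimestampType.
case class testData(GamingDate: Timestamp)

object ParseToTimestamp {
  def parse(s: String): Timestamp = {
    val fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss")
    new Timestamp(fmt.parse(s).getTime)
  }

  def main(args: Array[String]): Unit = {
    val row = testData(parse("2021-01-21 19:16:00"))
    println(row.GamingDate)
  }
}
```

Note that SimpleDateFormat is timezone-sensitive (it uses the JVM default zone), so pin the zone explicitly if reproducibility matters.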
    
  • 2021-01-21 19:59
    1. As you can read in the official documentation, dates in Spark SQL are represented using java.sql.Timestamp. If you want to use Joda time, you have to convert the output to the correct type.

    2. Spark SQL can easily handle standard date formats using type casting:

      sc.parallelize(Seq(Tuple1("2016-01-11 00:01:02")))
        .toDF("dt")
        .select($"dt".cast("timestamp"))
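Outside Spark, the same standard format is accepted by java.sql.Timestamp.valueOf, which can be handy for checking what the cast will produce (a plain-JDK aside, not Spark API):

```scala
import java.sql.Timestamp

// Timestamp.valueOf parses the JDBC escape format
// "yyyy-[m]m-[d]d hh:mm:ss[.f...]", the same shape the "timestamp"
// cast above handles.
object CastCheck {
  def main(args: Array[String]): Unit = {
    val ts = Timestamp.valueOf("2016-01-11 00:01:02")
    println(ts) // prints: 2016-01-11 00:01:02.0
  }
}
```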
      