Spark Streaming: Read JSON from Kafka and add event_time

Asked by 悲&欢浪女 on 2021-01-26 07:49

I am trying to write a stateful Spark Structured Streaming job that reads from Kafka. As part of the requirement I need to add an 'event_time' column to my stream.

For context, here is a simplified sketch of the kind of pipeline involved; the bootstrap servers, topic name, and JSON schema (including arrivalTime being carried as a string) are placeholders rather than my exact setup:
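    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.{col, from_json}
    import org.apache.spark.sql.types.{StringType, StructType}

    val spark = SparkSession.builder()
      .appName("KafkaEventTime")
      .getOrCreate()

    // Kafka source; bootstrap servers and topic are placeholders.
    val raw = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092")
      .option("subscribe", "events")
      .load()

    // Assumed JSON payload: an id plus an arrivalTime carried as a string.
    val schema = new StructType()
      .add("id", StringType)
      .add("arrivalTime", StringType)

    // Parse the Kafka value bytes into a struct column named "value".
    val df = raw.select(from_json(col("value").cast("string"), schema).as("value"))

From here the goal is to derive an event_time timestamp column from value.arrivalTime.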

1 Answer
  • Answered 2021-01-26 08:26

    I believe I was able to solve this with the following:

    val withEventTime = df.withColumn("event_time", to_timestamp(col("value.arrivalTime")))
    

    Not sure why this worked and not the other approach.

    Since the job is stateful, the new column is mainly useful as a watermark/window key. A minimal sketch of that part, assuming an illustrative 10-minute watermark, 5-minute windows, and a console sink:
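    import org.apache.spark.sql.functions.window

    // Bound state with a watermark on event_time, then run an
    // illustrative windowed count over the same column.
    val counts = withEventTime
      .withWatermark("event_time", "10 minutes")
      .groupBy(window(col("event_time"), "5 minutes"))
      .count()

    // Console sink is just for checking the output locally.
    val query = counts.writeStream
      .outputMode("update")
      .format("console")
      .start()

    query.awaitTermination()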
