I am trying to write a stateful Spark Structured Streaming job that reads from Kafka. As part of the requirement I need to add 'event_time' to my stream as an additional column.
I believe I was able to solve this with the following:
val withEventTime = df.withColumn("event_time", to_timestamp(col("value.arrivalTime")))
I'm not sure why this worked and not the other one.
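For context, here is a minimal sketch of roughly what my full pipeline looks like. The broker address, topic name, schema, and the deviceId field are just placeholders standing in for my actual setup; the to_timestamp call on value.arrivalTime is the part from above that worked:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, from_json, to_timestamp, window}
import org.apache.spark.sql.types.{StringType, StructType}

object EventTimeStream {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("stateful-event-time")
      .getOrCreate()

    // Placeholder schema for the Kafka message value; the real payload has more fields.
    val valueSchema = new StructType()
      .add("arrivalTime", StringType)
      .add("deviceId", StringType)

    val df = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092") // placeholder broker
      .option("subscribe", "events")                        // placeholder topic
      .load()
      // Kafka delivers the value as binary, so cast it to string and parse the JSON into a struct.
      .select(from_json(col("value").cast("string"), valueSchema).as("value"))

    // The line that worked: turn the string field into a proper timestamp column.
    // If arrivalTime is not in a default-parsable format, to_timestamp takes a format string as a second argument.
    val withEventTime = df.withColumn("event_time", to_timestamp(col("value.arrivalTime")))

    // Stateful part: watermark on event_time and aggregate over event-time windows.
    val counts = withEventTime
      .withWatermark("event_time", "10 minutes")
      .groupBy(window(col("event_time"), "5 minutes"), col("value.deviceId"))
      .count()

    counts.writeStream
      .outputMode("update")
      .format("console")
      .start()
      .awaitTermination()
  }
}
```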