I use Spark 2.2.0-rc1. I've got a Kafka topic which I'm querying with a running watermarked aggregation, using a 1-minute watermark, giving out to
Pushing more data to Kafka should trigger Spark to output something. The current behavior is entirely due to the internal implementation.
When you push some data, StreamingQuery will generate a batch to run. When that batch finishes, it remembers the maximum event time seen in the batch. Then in the next batch, because you are using append
mode, StreamingQuery will use that maximum event time and the watermark to evict old values from the StateStore and output them. Therefore, you need to make sure at least two batches are generated in order to see output.
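The two-batch behavior can be sketched in plain Python. This is a simulation of the mechanism described above, not Spark's actual API: the StateStore is modeled as a plain dict of 1-minute tumbling windows, and the 1-minute watermark delay comes from the question. The class name and all timestamps are made up for illustration.

```python
from datetime import datetime, timedelta

WATERMARK_DELAY = timedelta(minutes=1)  # the 1-minute watermark from the question

class StreamingQuerySim:
    """Simulates how the watermark advances between micro-batches.

    The watermark used while running batch N is derived from the max event
    time seen up to batch N-1, so a finalized window can only be evicted
    (and, in append mode, emitted) in a batch *after* the one whose data
    pushed the watermark past the window's end.
    """

    def __init__(self):
        self.watermark = datetime.min       # no watermark before the first batch
        self.max_event_time = datetime.min  # max event time seen so far
        self.state_store = {}               # window start -> running count

    def run_batch(self, events):
        """Run one micro-batch; return the windows emitted in append mode."""
        # 1. Update state with the new events (1-minute tumbling windows).
        for t in events:
            window = t.replace(second=0, microsecond=0)
            self.state_store[window] = self.state_store.get(window, 0) + 1
            self.max_event_time = max(self.max_event_time, t)

        # 2. Evict windows that ended at or before the *current* watermark,
        #    which was computed at the end of the previous batch.
        emitted = {w: c for w, c in self.state_store.items()
                   if w + timedelta(minutes=1) <= self.watermark}
        for w in emitted:
            del self.state_store[w]

        # 3. Advance the watermark for the *next* batch.
        if self.max_event_time != datetime.min:
            self.watermark = self.max_event_time - WATERMARK_DELAY
        return emitted

# Batch 1 sees events at 12:00:30 and 12:05:00: nothing is emitted yet,
# but the watermark advances to 12:04:00 (max event time minus 1 minute).
q = StreamingQuerySim()
print(q.run_batch([datetime(2017, 6, 1, 12, 0, 30),
                   datetime(2017, 6, 1, 12, 5, 0)]))   # {}

# Batch 2 runs with the watermark from batch 1, so the 12:00 window
# (which ended before 12:04:00) is evicted and emitted; the 12:05 and
# 12:06 windows stay in the state store.
print(q.run_batch([datetime(2017, 6, 1, 12, 6, 0)]))
```

Note how the first batch always returns an empty result: its eviction step runs before any watermark has been established, which is exactly why you need at least two batches before append mode produces output.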