How to avoid empty files while writing parquet files?

囚心锁ツ  2021-01-16 07:15

I am reading from a Kafka queue using Spark Structured Streaming. After reading from Kafka I apply a filter to the dataframe and save the filtered dataframe as parquet files. This produces empty parquet files whenever a partition ends up with no rows after filtering. How can I avoid writing these empty files?
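A minimal sketch of the pipeline being described, assuming a hypothetical broker address (`localhost:9092`), topic name (`events`), filter predicate, and output/checkpoint paths:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

object KafkaToParquet {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("KafkaToParquet")
      .getOrCreate()

    // Stream records from Kafka (broker address and topic are placeholders).
    val kafkaDf = spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "localhost:9092")
      .option("subscribe", "events")
      .load()

    // Filter the dataframe; partitions emptied by the filter are what
    // end up as zero-row parquet files on write.
    val filtered = kafkaDf
      .selectExpr("CAST(value AS STRING) AS value")
      .filter(col("value").contains("keep")) // hypothetical predicate

    // Sink the filtered stream as parquet files.
    filtered.writeStream
      .format("parquet")
      .option("path", "/data/output")                   // hypothetical output dir
      .option("checkpointLocation", "/data/checkpoint") // required for streaming sinks
      .start()
      .awaitTermination()
  }
}
```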

4 Answers
  •  被撕碎了的回忆  2021-01-16 07:50

    If you are running in YARN client mode, setting the number of executor cores to 1 will solve the problem: only one task will run at any time per executor, as sketched below.
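
    A minimal sketch of that setting, assuming the rest of the session setup; `spark.executor.cores` is the programmatic equivalent of passing `--executor-cores 1` to spark-submit:

    ```scala
    import org.apache.spark.sql.SparkSession

    // Cap each executor at one core so at most one task runs per
    // executor at a time, which limits how many output files are
    // written concurrently.
    val spark = SparkSession.builder()
      .appName("KafkaToParquet")
      .master("yarn") // client deploy mode is the default
      .config("spark.executor.cores", "1")
      .getOrCreate()
    ```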
