Disable parquet metadata summary in Spark

Asked 2021-01-02 10:30

I have a Spark job (on Spark 1.4.1) receiving a stream of Kafka events. I would like to save them continuously as Parquet on Tachyon. The job starts like this:

    val lines = KafkaUtils.creat...   // snippet truncated in the original post
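For context, here is a minimal sketch of this kind of job using Spark 1.4-era streaming APIs; the ZooKeeper address, consumer group, topic map, batch interval, and tachyon:// path are placeholder assumptions, not from the original:

    import org.apache.spark.SparkConf
    import org.apache.spark.sql.{SQLContext, SaveMode}
    import org.apache.spark.streaming.{Seconds, StreamingContext}
    import org.apache.spark.streaming.kafka.KafkaUtils

    val conf = new SparkConf().setAppName("kafka-to-parquet")
    val ssc = new StreamingContext(conf, Seconds(10))
    val sqlContext = new SQLContext(ssc.sparkContext)

    // Receive (key, message) pairs from Kafka and keep only the message body.
    val lines = KafkaUtils.createStream(
      ssc, "zkhost:2181", "consumer-group", Map("events" -> 1)).map(_._2)

    // Append every non-empty micro-batch to one Parquet directory on Tachyon.
    lines.foreachRDD { rdd =>
      if (!rdd.isEmpty()) {
        sqlContext.read.json(rdd)
          .write.mode(SaveMode.Append)
          .parquet("tachyon://master:19998/events")   // placeholder URI
      }
    }

    ssc.start()
    ssc.awaitTermination()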
2 Answers
  • 2021-01-02 11:27

    setting "parquet.enable.summary-metadata" as text ("false" and not false) seems to work for us.

    By the way, Spark does use the _common_metadata file (we copy it over manually for recurring jobs).
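    A minimal sketch of setting that flag as a string; it assumes the flag is picked up from the Hadoop configuration, and that sc is an existing SparkContext:

        // Disable Parquet summary files by setting the flag as the
        // string "false" on the Hadoop configuration that Spark passes
        // to the Parquet writer.
        sc.hadoopConfiguration.set("parquet.enable.summary-metadata", "false")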

  • 2021-01-02 11:36

    Spark 2.0 no longer saves metadata summary files by default; see SPARK-15719.

    If you are working with data hosted in S3, you may still find Parquet performance hurt by Parquet itself trying to scan the tail of every object to check its schema. That can be disabled explicitly:

    sparkConf.set("spark.sql.parquet.mergeSchema", "false")
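    In Spark 2.x the same option can also be set when building the session; a minimal sketch (the app name is a placeholder):

        import org.apache.spark.sql.SparkSession

        // Build a session with Parquet schema merging disabled up front.
        val spark = SparkSession.builder()
          .appName("no-parquet-schema-merge")   // placeholder name
          .config("spark.sql.parquet.mergeSchema", "false")
          .getOrCreate()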
    