save Spark dataframe to Hive: table not readable because “parquet not a SequenceFile”

走了就别回头了 · 2020-12-28 22:08

I'd like to save data from a Spark (v1.3.0) dataframe to a Hive table using PySpark.

The documentation states:

"spark.sql.hive.convertMetasto
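The "parquet not a SequenceFile" error usually means Spark wrote the table data as Parquet while the Hive metastore recorded the table with Hive's default SerDe (SequenceFile), so Hive cannot read it back. A common workaround on Spark 1.3 is to create the table in Hive with an explicit `STORED AS PARQUET` clause and insert into it, instead of letting `saveAsTable` define the storage format. A minimal sketch, assuming a running Hive metastore, a `HiveContext`, and a dataframe `df` whose schema matches the table (the database, table, and column names here are hypothetical):

```python
from pyspark import SparkContext
from pyspark.sql import HiveContext

sc = SparkContext(appName="hive-parquet-example")
sqlContext = HiveContext(sc)

# Create the Hive table with an explicit Parquet storage format,
# so the metastore records the Parquet SerDe instead of the
# SequenceFile default.
sqlContext.sql("""
    CREATE TABLE IF NOT EXISTS my_db.my_table (
        id INT,
        name STRING
    )
    STORED AS PARQUET
""")

# Insert the dataframe into the pre-created table rather than
# letting saveAsTable create it with Spark-specific metadata.
df.insertInto("my_db.my_table")
```

In Spark 1.3 `insertInto` is a method on `DataFrame` itself; from 1.4 onward the equivalent call moved to `df.write.insertInto(...)`.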

4 Answers
  •  时光说笑 · 2020-12-28 22:44

    metadata doesn't already exist. In other words, it adds any partitions that exist on HDFS but are missing from the metastore to the Hive metastore.
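    The behavior described above, registering partitions that exist on HDFS but are missing from the metastore, matches Hive's `MSCK REPAIR TABLE` command, which can also be issued through a `HiveContext`. A hedged sketch (the table name is hypothetical):

    ```sql
    -- Scan the table's HDFS location and add any partitions that
    -- exist on disk but are not yet recorded in the metastore.
    MSCK REPAIR TABLE my_db.my_table;
    ```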
