Spark SQL saveAsTable returns empty result

Backend · Open · 2 answers · 498 views
梦毁少年i  2021-01-20 17:00

I am using the following code for creating / inserting data into a Hive table in Spark SQL:

import org.apache.spark.sql.SparkSession

// Note: the original snippet was cut off after .master( — the master URL,
// enableHiveSupport() and getOrCreate() below are assumed to restore a
// runnable sketch.
val sc = SparkSession
  .builder()
  .appName("App")
  .master("local[*]")
  .enableHiveSupport()
  .getOrCreate()
2 answers
  • 2021-01-20 17:59

    You should just need to run MSCK REPAIR TABLE when adding new partitions. See Hive docs: https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL#LanguageManualDDL-RecoverPartitions(MSCKREPAIRTABLE)
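    The repair can also be issued from Spark itself through `spark.sql`. A minimal sketch, assuming a Hive-enabled `SparkSession` named `spark`; the table name `my_table` is hypothetical:

    ```scala
    // Register partitions whose directories exist on disk but are not yet
    // recorded in the metastore. Table name is a placeholder.
    spark.sql("MSCK REPAIR TABLE my_table")

    // Spark SQL also accepts the equivalent form:
    spark.sql("ALTER TABLE my_table RECOVER PARTITIONS")
    ```

    After either statement, `SELECT`s against the table should see the newly added partitions.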

  • 2021-01-20 18:02

    Spark SQL partitioning is not compatible with Hive. This issue is tracked as SPARK-14927.

    As a recommended workaround, you can create the partitioned table with Hive and only insert into it from Spark.
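    A sketch of that workaround, assuming a Hive-enabled `SparkSession` named `spark` and a DataFrame `df` whose columns match the table with the partition column last; the table and column names are hypothetical:

    ```scala
    // 1. Create the partitioned table with Hive-compatible DDL
    //    (in Hive itself, or via spark.sql as here), instead of
    //    letting saveAsTable create it.
    spark.sql("""
      CREATE TABLE IF NOT EXISTS events (id BIGINT, name STRING)
      PARTITIONED BY (year INT)
      STORED AS PARQUET
    """)

    // 2. Allow dynamic partition inserts, then insert from Spark.
    spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")
    df.write.mode("append").insertInto("events")
    ```

    Because the table layout is now Hive's own, both Hive and Spark read the partitions consistently, avoiding the empty-result behavior described in the question.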
