I am using the following code to create and insert data into a Hive table from Spark SQL:
val sc = SparkSession
  .builder()
  .appName("App")
  .master("local[*]") // master URL assumed; the original snippet was cut off here
  .enableHiveSupport() // required to read from and write to Hive tables
  .getOrCreate()
You should just need to run MSCK REPAIR TABLE
when adding new partitions. See Hive docs: https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL#LanguageManualDDL-RecoverPartitions(MSCKREPAIRTABLE)
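For example, if you already have a Hive-enabled SparkSession (spark below), you can issue the repair statement straight from Spark; my_table is a placeholder for your actual table name:

```scala
// Re-sync the Hive metastore with the partition directories found
// under the table's location on disk.
spark.sql("MSCK REPAIR TABLE my_table")
```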
Spark SQL's partitioning is not compatible with Hive's; this issue is documented in SPARK-14927.
As a recommended workaround, create the partitioned table with Hive DDL and only insert into it from Spark.
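A minimal sketch of that workaround, assuming a Hive-enabled SparkSession named spark; the table name events, its columns (id, payload, dt), and the sample data are all hypothetical:

```scala
// Define the partitioned table with plain Hive DDL (running it in
// beeline works too; issuing it via spark.sql keeps the example self-contained).
spark.sql("""
  CREATE TABLE IF NOT EXISTS events (id BIGINT, payload STRING)
  PARTITIONED BY (dt STRING)
  STORED AS PARQUET
""")

// Allow dynamic partition inserts, then write only from Spark.
spark.sql("SET hive.exec.dynamic.partition.mode=nonstrict")

val df = spark.createDataFrame(Seq(
  (1L, "a", "2016-06-01"),
  (2L, "b", "2016-06-02")
)).toDF("id", "payload", "dt") // partition column must come last for insertInto

// insertInto matches columns by position against the Hive-defined schema,
// so the table keeps Hive's partition layout.
df.write.mode("append").insertInto("events")
```

Because the table's schema and partition layout are owned by Hive, both Hive and Spark can read the data afterwards; Spark only appends rows.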