I am using the following code to create a Hive table and insert data into it from Spark SQL:

```scala
val sc = SparkSession.builder().appName("App").master(
```
You should just need to run `MSCK REPAIR TABLE` after adding new partitions. See the Hive docs: https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL#LanguageManualDDL-RecoverPartitions(MSCKREPAIRTABLE)

```sql
MSCK REPAIR TABLE table_name;
```

This scans the table's location on the filesystem for partition directories that are not yet registered in the metastore and adds them, so the new partitions become queryable.
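From Spark you can issue the same statement through `spark.sql`. A minimal sketch, assuming a Hive-enabled `SparkSession` and a placeholder table name `mytable` (neither appears in the original post):

```scala
import org.apache.spark.sql.SparkSession

// Hive support is required so the repair statement reaches the Hive metastore.
val spark = SparkSession.builder()
  .appName("RepairExample")
  .enableHiveSupport()
  .getOrCreate()

// After new partition directories have been written under the table's
// location, register them in the metastore. "mytable" is a placeholder.
spark.sql("MSCK REPAIR TABLE mytable")
```

Note that `MSCK REPAIR TABLE` only discovers directories that follow Hive's `key=value` partition naming convention; partitions written under arbitrary paths must be added with `ALTER TABLE ... ADD PARTITION` instead.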