Ignoring non-spark config property: hive.exec.dynamic.partition.mode

Submitted by 人盡茶涼 on 2020-07-06 10:59:30

Question


How can I run spark-shell with hive.exec.dynamic.partition.mode=nonstrict?

I tried (as suggested here):

  export SPARK_MAJOR_VERSION=2; spark-shell --conf "hive.exec.dynamic.partition.mode=nonstrict" --properties-file /opt/_myPath_/sparkShell.conf

but I get the warning "Ignoring non-spark config property: hive.exec.dynamic.partition.mode=nonstrict".


PS: using Spark version 2.2.0.2.6.4.0-91, Scala version 2.11.8

NOTE

The need arises after an error on df.write.mode("overwrite").insertInto("db.partitionedTable"):

org.apache.spark.SparkException: Dynamic partition strict mode requires at least one static partition column. To turn this off set hive.exec.dynamic.partition.mode=nonstrict
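
For context, here is a minimal sketch of the kind of write that raises this exception. The table, columns, and sample data are hypothetical, and it assumes spark-shell with Hive support (where spark.implicits._ is already in scope):

// Hypothetical partitioned table; under strict mode (the Hive default),
// a write in which every partition column is dynamic is rejected.
spark.sql("CREATE TABLE db.partitionedTable (id INT, name STRING) PARTITIONED BY (dt STRING)")

val df = Seq((1, "a", "2020-01-01"), (2, "b", "2020-01-02")).toDF("id", "name", "dt")

// insertInto matches columns by position, partition column (dt) last.
// This throws the SparkException above while strict mode is in effect.
df.write.mode("overwrite").insertInto("db.partitionedTable")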


Answer 1:


You can try using the spark.hadoop.* prefix, as suggested in the Custom Spark Configuration section of the docs for version 2.3. It might work in 2.2 as well, if it was just a doc bug :)

spark-shell \
  --conf "spark.hadoop.hive.exec.dynamic.partition=true" \
  --conf "spark.hadoop.hive.exec.dynamic.partition.mode=nonstrict" \
  ...
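
To confirm the prefix was honored (my own check, not part of the original answer): Spark strips the spark.hadoop. prefix and copies the remainder into the Hadoop Configuration, so you can look the property up under its plain name from inside the shell:

// Should return "nonstrict" if the property was propagated.
spark.sparkContext.hadoopConfiguration.get("hive.exec.dynamic.partition.mode")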



Answer 2:


I had the same problem, and the only workaround I found was to set the config directly in the session before writing, like this:

spark.conf.set("hive.exec.dynamic.partition.mode", "nonstrict")
df.write.mode("overwrite").insertInto("db.partitionedTable")
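
As a variation (my sketch, not from the original answer): if you build the SparkSession yourself, e.g. in a submitted application rather than spark-shell, the same properties can be passed at construction time:

import org.apache.spark.sql.SparkSession

// The app name is hypothetical; both dynamic-partition properties are set
// up front so that later insertInto calls don't hit strict mode.
val spark = SparkSession.builder()
  .appName("dynamicPartitionDemo")
  .enableHiveSupport()
  .config("hive.exec.dynamic.partition", "true")
  .config("hive.exec.dynamic.partition.mode", "nonstrict")
  .getOrCreate()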


Source: https://stackoverflow.com/questions/58633753/ignoring-non-spark-config-property-hive-exec-dynamic-partition-mode
