Enable case sensitivity for spark.sql globally

南笙 2020-12-07 01:08

The option spark.sql.caseSensitive controls whether column names etc. should be case sensitive. It can be set per session, e.g. by

    spark_session.sql('set spark.sql.caseSensitive=true')

but setting it in $SPARK_HOME/conf/spark-defaults.conf does not seem to enable it globally. Is there a way to do that?
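For reference, the option can also be set per application when the session is created, which is not the global default asked about here. A minimal sketch, assuming PySpark 2.x or later (the app name is made up for illustration):

    from pyspark.sql import SparkSession

    # Set the option at session creation; the configuration value is the
    # string "true" or "false".
    spark_session = (
        SparkSession.builder
        .appName("case-sensitivity-demo")  # hypothetical app name
        .config("spark.sql.caseSensitive", "true")
        .getOrCreate()
    )

    print(spark_session.conf.get("spark.sql.caseSensitive"))  # prints "true"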
3 answers
  • Yet another way for PySpark. Using a SparkSession object named spark:

    spark.conf.set('spark.sql.caseSensitive', True)
    
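    To see what the flag changes, a small check (a sketch; the DataFrame and its column names are made up for illustration):

    df = spark.createDataFrame([(1, "a")], ["id", "value"])

    spark.conf.set("spark.sql.caseSensitive", True)
    # With case sensitivity on, "ID" no longer resolves to "id";
    # df.select("ID") would raise an AnalysisException here.

    spark.conf.set("spark.sql.caseSensitive", False)
    df.select("ID").show()  # now resolves case-insensitively to the "id" column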
  • 2020-12-07 01:52

    As it turns out, setting

    spark.sql.caseSensitive: True
    

    in $SPARK_HOME/conf/spark-defaults.conf DOES work after all. It just has to be done in the configuration of the Spark driver as well, not the master or workers. Apparently I forgot that when I last tried.

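    To confirm the driver actually picked the value up, it can be read back from a driver-side session. A sketch, assuming the session object is named spark (as in the pyspark shell):

    print(spark.conf.get("spark.sql.caseSensitive"))  # "true" once the default applies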
  • 2020-12-07 01:56

    Try sqlContext.sql("set spark.sql.caseSensitive=true") in your Python code, which worked for me.

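    The same SET statement also works through the newer SparkSession entry point (Spark 2.x and later); a sketch, assuming a session object named spark:

    spark.sql("set spark.sql.caseSensitive=true")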