Setting “spark.memory.storageFraction” in Spark does not work
Question: I am trying to tune Spark's memory parameters. I tried:

    sparkSession.conf.set("spark.memory.storageFraction", "0.1") // sparkSession has already been created

After I submitted the job and checked the Spark UI, I found that "Storage Memory" was still the same as before, so the setting did not take effect. What is the correct way to set "spark.memory.storageFraction"? I am using Spark 2.0.

Answer 1: I faced the same problem. After reading some of the Spark source code on GitHub, I think the "Storage Memory" shown on the Spark UI is misleading: it does not indicate the size of the storage region.
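For context, here is a minimal sketch (not taken from the original answer) of how a setting like spark.memory.storageFraction is normally applied: when building the SparkSession (or via spark-submit), before the executors start, rather than with conf.set on an already-running session. The application name below is a made-up placeholder.

    import org.apache.spark.sql.SparkSession

    // Memory settings are read when the executors are launched, so they must be
    // supplied when the session is created (or on the spark-submit command line),
    // not changed afterwards with sparkSession.conf.set.
    val spark = SparkSession.builder()
      .appName("memory-tuning-example")               // hypothetical app name
      .config("spark.memory.fraction", "0.6")         // share of heap for the unified (storage + execution) region; 0.6 is the Spark 2.0 default
      .config("spark.memory.storageFraction", "0.1")  // portion of that unified region protected for storage
      .getOrCreate()

Equivalently, the option can be passed on the command line, for example spark-submit --conf spark.memory.storageFraction=0.1 ... (the rest of the command is omitted here).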