Question
I am trying to tune the memory parameter of Spark. I tried:
sparkSession.conf.set("spark.memory.storageFraction", "0.1") // sparkSession has already been created
After submitting the job I checked the Spark UI and found that "Storage Memory" was still the same as before, so the setting did not appear to take effect.
What is the correct way to set "spark.memory.storageFraction"?
I am using Spark 2.0.
Answer 1:
I faced the same problem. After reading some of the Spark code on GitHub, I think the "Storage Memory" figure on the Spark UI is misleading: it does not indicate the size of the storage region; it actually represents maxMemory:
maxMemory = (executorMemory - reservedMemory [300 MB by default]) * spark.memory.fraction [default 0.6]
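As a rough back-of-the-envelope (a sketch only, assuming Spark 2.x defaults; the object and method names below are illustrative, not Spark's actual internals), with --executor-memory 4g the UI would show roughly 2.2 GB no matter what spark.memory.storageFraction is set to:

// Sketch of the calculation behind the "Storage Memory" figure on the
// executors page (Spark 2.x defaults assumed; names are illustrative).
object MaxMemorySketch {
  val ReservedMemoryBytes = 300L * 1024 * 1024   // fixed reserved memory
  val MemoryFraction      = 0.6                  // spark.memory.fraction default

  // systemMemory is the executor JVM heap (Runtime.getRuntime.maxMemory),
  // which in practice is slightly smaller than --executor-memory.
  def maxMemory(systemMemoryBytes: Long): Long =
    ((systemMemoryBytes - ReservedMemoryBytes) * MemoryFraction).toLong

  def main(args: Array[String]): Unit = {
    val fourGiB = 4L * 1024 * 1024 * 1024
    // (4096 MB - 300 MB) * 0.6 ≈ 2277 MB, and spark.memory.storageFraction
    // never enters this formula -- which is why the UI number does not move
    // when that setting changes.
    println(maxMemory(fourGiB) / (1024 * 1024) + " MB")   // prints "2277 MB"
  }
}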
See these for more detail:
Spark UI executors-page source code
getMaxMemory source code
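Also note that, as a general rule, memory settings like this are supplied before the application starts (via spark-submit --conf or when the session is built), not on an already-created session. A minimal sketch, reusing the 0.1 value from the question:

import org.apache.spark.sql.SparkSession

// Set the fraction when building the session so it is in place before
// the executors are launched; the same can be done with
//   spark-submit --conf spark.memory.storageFraction=0.1
val spark = SparkSession.builder()
  .appName("storage-fraction-example")
  .config("spark.memory.storageFraction", "0.1")
  .getOrCreate()

println(spark.conf.get("spark.memory.storageFraction"))   // 0.1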
Source: https://stackoverflow.com/questions/43512231/setting-spark-memory-storagefraction-in-spark-does-not-work