Setting “spark.memory.storageFraction” in Spark does not work

Submitted by 拥有回忆 on 2021-02-19 02:08:41

Question


I am trying to tune the memory parameter of Spark. I tried:

sparkSession.conf.set("spark.memory.storageFraction","0.1") //sparkSession has been created

After I submitted the job and checked the Spark UI, I found that "Storage Memory" was the same as before, so the setting appeared to have no effect.

What is the correct way to set "spark.memory.storageFraction"?

I am using Spark 2.0.
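For reference: spark.memory.storageFraction is read when the executors' memory managers are created, so it is normally supplied before the SparkSession exists rather than changed at runtime. A minimal sketch, with the 0.1 value taken from the question and the app name purely illustrative:

import org.apache.spark.sql.SparkSession

// Supply the setting before the session (and its executors) are created,
// instead of calling sparkSession.conf.set after the fact.
val spark = SparkSession.builder()
  .appName("storage-fraction-demo")                 // hypothetical name
  .config("spark.memory.storageFraction", "0.1")
  .getOrCreate()

The same value can also be passed at submission time with spark-submit --conf spark.memory.storageFraction=0.1.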


Answer 1:


I faced the same problem. After reading some of the Spark source code on GitHub, I think the "Storage Memory" shown on the Spark UI is misleading: it does not indicate the size of the storage region; it actually represents the maxMemory:

maxMemory = (executorMemory - reservedMemory [300 MB in Spark 2.x]) * spark.memory.fraction [default 0.6]
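For example, with an executor heap of roughly 2048 MB (the actual heap is slightly smaller than --executor-memory), this gives:

maxMemory ≈ (2048 - 300) * 0.6 ≈ 1049 MB

That ≈1 GB figure is what the Executors tab reports as "Storage Memory", and it stays the same no matter what spark.memory.storageFraction is set to, since that setting only controls how the unified region is split between storage and execution.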

Check these for more detail:

Spark UI executors-page source code

getMaxMemory source code
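Since the UI column won't change, a quick way to confirm which value the application actually started with is the Environment tab, or reading it back from the session conf; a minimal check, assuming a SparkSession named spark:

println(spark.conf.get("spark.memory.storageFraction"))   // prints the value held in the session conf, e.g. 0.1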



Source: https://stackoverflow.com/questions/43512231/setting-spark-memory-storagefraction-in-spark-does-not-work
