Why is this simple Spark program not utilizing multiple cores?

情深已故 2021-02-10 01:39

So, I'm running this simple program on a 16-core multicore system. I run it by issuing the following:

spark-submit --master local[*] pi.py

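The pi.py listing itself did not survive in the post. As a stand-in, here is a minimal Monte Carlo pi estimator in PySpark, which is the usual shape of such a program; the sample count and partition count below are illustrative, not the asker's values:

import random
from pyspark import SparkContext

sc = SparkContext(appName="pi")

def inside(_):
    # Draw one point uniformly at random in the unit square and test
    # whether it falls inside the quarter circle of radius 1.
    x, y = random.random(), random.random()
    return x * x + y * y < 1.0

n = 10000000
# The second argument to parallelize is the number of partitions.
# Spark runs one task per partition, so if a program like this asks
# for only one or two partitions, only one or two of the 16 cores
# will ever be busy, even under --master local[*].
count = sc.parallelize(range(n), 16).filter(inside).count()
print("Pi is roughly %f" % (4.0 * count / n))

sc.stop()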

4 answers
  •  遇见更好的自我
    2021-02-10 02:14

    To change how many CPU cores are consumed, set the number of cores to be used by the workers in the spark-env.sh file under spark-installation-directory/conf. This is done with the SPARK_EXECUTOR_CORES attribute in spark-env.sh; its value is set to 1 by default.
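    For reference, a minimal sketch of that setting, assuming conf/spark-env.sh has been created from the spark-env.sh.template that ships with Spark (the value 8 is illustrative):

        # spark-installation-directory/conf/spark-env.sh
        # Number of cores each executor may use; defaults to 1.
        export SPARK_EXECUTOR_CORES=8

    Note that a --master local[*] run already starts one worker thread per logical core, so in local mode the number of partitions in the job usually matters more than this setting.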
