Question
I have an Apache Spark 1.6.1 standalone cluster set up on a single machine with the following specifications:
- CPU: Core i7-4790 (# of cores: 4, # of threads: 8)
- RAM: 16GB
I didn't set anything, so Spark uses the default values, which for cores is "all the available cores". Based on that, my question is:
Why is Spark detecting 8 cores, when I only have 4?
Answer 1:
"All the available cores" means logical (virtual) cores, not physical ones: the standalone worker's default core count comes from asking the JVM how many processors are available, and the JVM reports logical processors. Since your Core i7-4790 supports Hyper-Threading, the OS exposes 8 logical cores, so Spark sees 8.
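A minimal sketch of what that detection boils down to; the object name is illustrative, but the `Runtime` call is the standard JVM API:

```scala
// The JVM reports *logical* processors, which is what Spark's
// standalone worker falls back to for its default core count.
object CoreCount {
  def main(args: Array[String]): Unit = {
    // On a 4-core CPU with Hyper-Threading enabled this prints 8.
    println(Runtime.getRuntime.availableProcessors())
  }
}
```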
There is no switch to make Spark distinguish physical from Hyper-Threaded cores, but you can cap the count explicitly: in standalone mode, set SPARK_WORKER_CORES=4 in conf/spark-env.sh so the worker only offers 4 cores, or limit a single application with the spark.cores.max property, as sketched below.
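A minimal sketch of capping one application at 4 cores via SparkConf (Spark 1.6 API); the app name and master URL are placeholders for your setup:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Request at most 4 cores for this application on a standalone cluster.
// "four-core-app" and the master URL below are assumed placeholders.
val conf = new SparkConf()
  .setAppName("four-core-app")
  .setMaster("spark://localhost:7077")
  .set("spark.cores.max", "4") // total cores this app may use
val sc = new SparkContext(conf)
```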
Source: https://stackoverflow.com/questions/37632396/why-is-spark-detecting-8-cores-when-i-only-have-4