I'm trying to set up a standalone Spark 2.0 server to process an analytics function in parallel. To do this I want to have a single worker with multiple executors.
I believe you mixed up local and standalone modes: `local`, `local[*]`, and `local[n]` all refer to local mode. `spark.executor.cores` is not applicable in local mode because there is only one embedded executor.
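For illustration, here is a minimal Scala sketch of a local-mode session (the app name and thread count are hypothetical); note that `local[4]` still gives you the single embedded executor, just with four task threads:

```scala
import org.apache.spark.sql.SparkSession

// Local mode: the driver and the single embedded executor live in one JVM.
// spark.executor.cores has no effect here; the [4] only controls task threads.
val spark = SparkSession.builder()
  .master("local[4]")            // assumed thread count
  .appName("local-mode-example") // hypothetical app name
  .getOrCreate()
```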
Standalone mode requires a standalone Spark cluster. It needs a master node (which can be started using the `SPARK_HOME/sbin/start-master.sh` script) and at least one worker node (which can be started using the `SPARK_HOME/sbin/start-slave.sh` script).
`SparkConf` should then be created with the master node's address (`spark://host:port`).
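As a rough sketch of that standalone setup (the host, port, app name, and core counts below are assumptions, not values from the question), this is how you would point the session at the master and split a single worker's cores into several executors:

```scala
import org.apache.spark.sql.SparkSession

// Standalone mode: the master URL points at the node started by start-master.sh.
// With spark.executor.cores set, one worker offering e.g. 8 cores can host
// several 2-core executors instead of a single large executor.
val spark = SparkSession.builder()
  .master("spark://master-host:7077")   // assumed host; 7077 is the default master port
  .appName("standalone-example")        // hypothetical app name
  .config("spark.executor.cores", "2")  // cores per executor
  .config("spark.cores.max", "8")       // total cores the application may use
  .getOrCreate()
```

With these assumed numbers, and enough worker memory, a worker advertising 8 cores would launch 4 executors for the application, which is the single-worker, multiple-executor layout asked about.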