What is the relationship between workers, worker instances, and executors?
Question: In Spark Standalone mode, there are master and worker nodes. A few questions:

- Do 2 worker instances mean one worker node running 2 worker processes?
- Does every worker instance hold an executor for a specific application (managing storage and tasks), or does one worker node hold one executor?
- Is there a flow chart explaining the Spark runtime, e.g. for a word count?

Answer 1: I suggest reading the Spark cluster docs first, but even more so this Cloudera blog post explaining these modes. Your first
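As context for the "2 worker instances" question: in standalone mode, the number of worker daemon processes launched per node is controlled in `conf/spark-env.sh`. A minimal sketch with illustrative values (adjust cores/memory to your machine):

```shell
# conf/spark-env.sh — Spark Standalone worker settings (illustrative values)

# Launch 2 worker processes on this node:
export SPARK_WORKER_INSTANCES=2

# Resources available to EACH worker instance (so this node offers
# 2 x 4 = 8 cores and 2 x 8g = 16g total to applications):
export SPARK_WORKER_CORES=4
export SPARK_WORKER_MEMORY=8g
```

Note that each application then gets its own executor process(es) on each worker; the worker is a cluster-level daemon, while executors are per-application.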