Question
I have been using Spark + Python to get some work done, and it's great, but I have a question in my mind:
Where is the Spark job of a transformation and an action done? Is the transformation done on the Spark master (or driver) while the action is done on the workers (executors), or are both done on the workers (executors)?
Thanks
Answer 1:
Workers (aka slaves) are running Spark instances where executors live to execute tasks.
Transformations are performed on the workers; when an action is called, the computed data is brought back to the driver.
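As a minimal sketch of that behavior (the data, app name, and local master are made up for illustration): the transformations below only build up lineage without running anything, and it is the final action that launches tasks on the executors and returns a result to the driver:

```python
from pyspark import SparkConf, SparkContext

sc = SparkContext.getOrCreate(SparkConf().setMaster("local[*]").setAppName("lazy-demo"))

rdd = sc.parallelize(range(1_000_000))

# Transformations are lazy: no work happens on the executors yet;
# Spark only records the lineage of these operations.
squares = rdd.map(lambda x: x * x)
evens = squares.filter(lambda x: x % 2 == 0)

# The action triggers the job: tasks run on the executors, and only
# the final count is shipped back to the driver.
print(evens.count())
```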
An application in Spark is executed in three steps:

1. Create the RDD graph, i.e. a DAG (directed acyclic graph) of RDDs that represents the entire computation.
2. Create the stage graph, i.e. a DAG of stages that forms a logical execution plan based on the RDD graph. Stages are created by breaking the RDD graph at shuffle boundaries (see the sketch after this list).
3. Based on the plan, schedule and execute tasks on the workers.
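To make steps 1 and 2 concrete, here is a small PySpark sketch (hypothetical data and app name) that prints the lineage Spark records. The indentation change in the output marks the shuffle boundary introduced by reduceByKey, where the plan is split into two stages:

```python
from pyspark import SparkConf, SparkContext

sc = SparkContext.getOrCreate(SparkConf().setMaster("local[*]").setAppName("dag-demo"))

# Build a tiny RDD graph: parallelize -> reduceByKey (a shuffle operation).
pairs = sc.parallelize([("a", 1), ("b", 2), ("a", 3)])
counts = pairs.reduceByKey(lambda x, y: x + y)

# toDebugString() returns the lineage as bytes; the indented section of the
# output is the parent stage, separated from the child stage by the shuffle.
print(counts.toDebugString().decode("utf-8"))
```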
Answer 2:
Transformations run on the executors.
Actions run on both the executors and the driver: most of the work still happens on the executors, but the final steps, such as merging the outputs, are executed on the driver.
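A small sketch of that split (local context and data are assumptions for illustration): reduce() computes a partial result inside each executor task, and the driver only merges the per-partition results:

```python
from pyspark import SparkConf, SparkContext

sc = SparkContext.getOrCreate(SparkConf().setMaster("local[*]").setAppName("reduce-demo"))

# Each of the 4 partitions is summed by a task on an executor;
# the driver then combines the 4 partial sums into the final answer.
total = sc.parallelize(range(100), numSlices=4).reduce(lambda a, b: a + b)
print(total)  # 4950
```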
Answer 3:
When any action is called on the RDD, Spark creates the DAG and submits it to the DAG scheduler.

The DAG scheduler divides the operators into stages of tasks. A stage comprises tasks based on the partitions of the input data, and the DAG scheduler pipelines operators together within a stage.

The stages are passed on to the task scheduler, which launches the tasks via the cluster manager (Spark Standalone/YARN/Mesos). The task scheduler does not know about the dependencies between stages.

The tasks (transformations) execute on the workers (executors), and when an action (take/collect) is called, the data is brought back to the driver.
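For example (a sketch with made-up data), both take() and collect() are actions that ship results from the executors back to the driver, but take(n) only scans as many partitions as it needs, while collect() pulls the entire RDD:

```python
from pyspark import SparkConf, SparkContext

sc = SparkContext.getOrCreate(SparkConf().setMaster("local[*]").setAppName("action-demo"))

big = sc.parallelize(range(10_000)).map(lambda x: x * 2)

print(big.take(5))         # [0, 2, 4, 6, 8] -- reads only the partitions it needs
print(len(big.collect()))  # 10000 -- the entire RDD is returned to the driver
```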
Source: https://stackoverflow.com/questions/39946878/where-is-the-spark-job-of-transformation-and-action-done