Spark Configuration: SPARK_MEM vs. SPARK_WORKER_MEMORY

Happy的楠姐 2021-02-06 02:35

In spark-env.sh, it's possible to configure the following environment variables:

# - SPARK_WORKER_MEMORY, to set how much memory to use (e.g. 1000m, 2g)
export SPARK_WORKER_MEMORY=22g

# - SPARK_MEM, to change the amount of memory used per node (this should
#   be in the same format as the JVM's -Xmx option, e.g. 300m or 1g)
export SPARK_MEM=3g

What is the relationship between SPARK_WORKER_MEMORY and SPARK_MEM, and which one actually controls how much memory my application gets?

1 Answer

感情败类 2021-02-06 03:01

    A standalone cluster can host multiple Spark applications at once (each such "cluster" is tied to its own SparkContext): for example, one running k-means, one running Shark, and another running some interactive data mining.
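
    A minimal sketch of what this looks like from the application side, assuming the later SparkConf-based API (where spark.executor.memory plays the role SPARK_MEM used to); the master URL, app names, and memory values here are illustrative, not taken from the question. Each driver program runs in its own JVM, creates its own SparkContext against the same standalone master, and requests its own per-node memory:

        import org.apache.spark.{SparkConf, SparkContext}

        // Driver program 1 (its own JVM): a k-means style batch job
        val kmeansConf = new SparkConf()
          .setMaster("spark://master-host:7077")   // assumed standalone master URL
          .setAppName("kmeans-job")
          .set("spark.executor.memory", "3g")      // this app's share on each worker node
        val kmeansSc = new SparkContext(kmeansConf)

        // Driver program 2 (a separate JVM, e.g. an interactive shell session)
        val miningConf = new SparkConf()
          .setMaster("spark://master-host:7077")
          .setAppName("interactive-mining")
          .set("spark.executor.memory", "3g")
        val miningSc = new SparkContext(miningConf)

        // Both applications coexist as long as the sum of their per-node requests
        // stays within SPARK_WORKER_MEMORY on each worker.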

    In this case, the 22GB is the total amount of memory you allocated to the Spark standalone cluster on each worker node, and your particular SparkContext is taking 3GB per node. Since 22GB / 3GB rounds down to 7 concurrent applications, you can create 6 more SparkContexts alongside the current one, together using up to 7 × 3GB = 21GB.
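
    As a quick back-of-the-envelope check (the 22 and 3 are the values from the question; the variable names are just illustrative):

        // Per-worker memory pool the standalone worker can hand out (SPARK_WORKER_MEMORY)
        val workerMemoryGb = 22
        // Memory each application takes on every node it runs on (SPARK_MEM)
        val perAppMemoryGb = 3

        val maxConcurrentApps = workerMemoryGb / perAppMemoryGb    // 22 / 3 = 7 (integer division)
        val additionalApps    = maxConcurrentApps - 1              // 6 more besides the one running
        val committedGb       = maxConcurrentApps * perAppMemoryGb // 7 * 3 = 21 of the 22 GB

        println(s"$additionalApps more apps possible, $committedGb GB of $workerMemoryGb GB committed")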
