Apache Spark running spark-shell on YARN error

Asked by 时光说笑, 2021-02-06 10:37

I downloaded: spark-2.1.0-bin-hadoop2.7.tgz from http://spark.apache.org/downloads.html. I have Hadoop HDFS and YARN started with $ start-dfs.sh and

2 Answers
  •  -上瘾入骨i
    2021-02-06 11:09

    I ran into the same problem. When I checked the NodeManager log, I found this warning:

    2017-10-26 19:43:21,787 WARN org.apache.hadoop.yarn.server.nodemanager.containermanager.monitor.ContainersMonitorImpl: Container [pid=3820,containerID=container_1509016963775_0001_02_000001] is running beyond virtual memory limits. Current usage: 339.0 MB of 1 GB physical memory used; 2.2 GB of 2.1 GB virtual memory used. Killing container.

    So I allowed the container more virtual memory by raising yarn.nodemanager.vmem-pmem-ratio in yarn-site.xml (the default value is 2.1). That fixed it.
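    For reference, the change described above would look roughly like this in yarn-site.xml (the value 4 is only an illustrative choice, not a recommendation; pick any ratio large enough to cover your containers' virtual-memory footprint):

    ```xml
    <!-- yarn-site.xml: virtual memory allowed per unit of physical memory.
         The default ratio is 2.1; 4 here is an illustrative value. -->
    <property>
      <name>yarn.nodemanager.vmem-pmem-ratio</name>
      <value>4</value>
    </property>
    ```

    Restart the NodeManagers after editing the file so the new ratio takes effect. An alternative sometimes used in development setups is to disable the virtual-memory check entirely by setting yarn.nodemanager.vmem-check-enabled to false, though raising the ratio is the safer option on shared clusters.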
