I downloaded spark-2.1.0-bin-hadoop2.7.tgz
from http://spark.apache.org/downloads.html, and I had Hadoop HDFS and YARN started with $ start-dfs.sh and $ start-yarn.sh.
I ran into the same problem. When I checked the NodeManager log, I found this warning:
2017-10-26 19:43:21,787 WARN org.apache.hadoop.yarn.server.nodemanager.containermanager.monitor.ContainersMonitorImpl: Container [pid=3820,containerID=container_1509016963775_0001_02_000001] is running beyond virtual memory limits. Current usage: 339.0 MB of 1 GB physical memory used; 2.2 GB of 2.1 GB virtual memory used. Killing container.
So I raised the virtual memory limit by increasing yarn.nodemanager.vmem-pmem-ratio in yarn-site.xml (its default value is 2.1, which matches the "2.1 GB virtual memory" limit in the log). After that it worked.
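For reference, this is roughly what the change looks like in yarn-site.xml (the value 4 here is just an example; pick a ratio large enough that physical-memory-limit × ratio exceeds your containers' actual virtual memory usage). Restart YARN after editing the file so the NodeManagers pick up the new setting:

```xml
<!-- yarn-site.xml -->
<configuration>
  <!-- Ratio of virtual memory to physical memory allowed per container.
       Default is 2.1; the container above was killed at 2.2 GB virtual
       against a 1 GB * 2.1 = 2.1 GB limit, so a larger ratio avoids the kill. -->
  <property>
    <name>yarn.nodemanager.vmem-pmem-ratio</name>
    <value>4</value>
  </property>
</configuration>
```

An alternative some people use is disabling the virtual memory check entirely by setting yarn.nodemanager.vmem-check-enabled to false, but raising the ratio is the less drastic option.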