Apache Spark: “failed to launch org.apache.spark.deploy.worker.Worker” or Master

Backend · Open · 2 answers · 424 views
Asked by 孤街浪徒 on 2021-02-03 11:54

I have created a Spark cluster on OpenStack running Ubuntu 14.04 with 8 GB of RAM. I created two virtual machines with 3 GB each (keeping 2 GB for the parent OS). Further, I cre…
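
Since each worker VM only has 3 GB, it is worth capping the worker's resources explicitly. A minimal sketch of conf/spark-env.sh using Spark's standard standalone settings (the values here are illustrative, not taken from the question):

    # conf/spark-env.sh on each worker VM (illustrative values for a 3 GB VM)
    export SPARK_WORKER_MEMORY=2g   # total memory the worker may hand to executors
    export SPARK_WORKER_CORES=2     # total cores the worker may hand to executors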

2 Answers
  •  太阳男子
    2021-02-03 12:39

    I had the same problem when running spark/sbin/start-slave.sh on the master node.

    hadoop@master:/opt/spark$ sudo ./sbin/start-slave.sh --master spark://master:7077
    starting org.apache.spark.deploy.worker.Worker, logging to /opt/spark/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-master.out
    failed to launch: nice -n 0 /opt/spark/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 --master spark://master:7077
      Options:
        -c CORES, --cores CORES  Number of cores to use
        -m MEM, --memory MEM     Amount of memory to use (e.g. 1000M, 2G)
        -d DIR, --work-dir DIR   Directory to run apps in (default: SPARK_HOME/work)
        -i HOST, --ip IP         Hostname to listen on (deprecated, please use --host or -h)
        -h HOST, --host HOST     Hostname to listen on
        -p PORT, --port PORT     Port to listen on (default: random)
        --webui-port PORT        Port for web UI (default: 8081)
        --properties-file FILE   Path to a custom Spark properties file.
                                 Default is conf/spark-defaults.conf.
    full log in /opt/spark/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-master.out
    

    I found my mistake: start-slave.sh does not take a --master flag; the master URL is passed as a positional argument, so the command is just:

    hadoop@master:/opt/spark$ sudo ./sbin/start-slave.sh spark://master:7077
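
    With the positional URL in place, the options from the usage text above can still follow it: -c caps the cores and -m the memory the worker offers. A sketch (the resource values and the master's default web UI port 8080 are assumptions, not from the original post):

    hadoop@master:/opt/spark$ sudo ./sbin/start-slave.sh spark://master:7077 -c 2 -m 2G
    hadoop@master:/opt/spark$ curl -s http://master:8080 | grep -i worker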
    

    I was following the steps of this tutorial: https://phoenixnap.com/kb/install-spark-on-ubuntu

    Hint: make sure to install all dependencies beforehand:

    sudo apt install scala git -y
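
    Once the worker registers, the bundled SparkPi example makes a quick smoke test of the whole cluster. A sketch, assuming the default /opt/spark layout (the examples jar name varies with the Spark/Scala version, hence the glob):

    /opt/spark/bin/spark-submit \
      --class org.apache.spark.examples.SparkPi \
      --master spark://master:7077 \
      /opt/spark/examples/jars/spark-examples_*.jar 100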
    
