“Bad substitution” when submitting spark job to yarn-cluster

Backend · open · 6 answers · 1530 views
萌比男神i 2020-12-31 07:32

I am doing a smoke test against a yarn cluster using yarn-cluster as the master with the SparkPi example program. Here is the command line:

6 Answers
  • 2020-12-31 08:09

    If you are using Spark with HDP, then you have to do the following things:

    Add these entries to $SPARK_HOME/conf/spark-defaults.conf, substituting your installed HDP version:

    spark.driver.extraJavaOptions -Dhdp.version=2.2.0.0-2041
    spark.yarn.am.extraJavaOptions -Dhdp.version=2.2.0.0-2041
    

    Create a file called java-opts in $SPARK_HOME/conf and add your installed HDP version to that file like this:

    -Dhdp.version=2.2.0.0-2041
    

    To figure out which HDP version is installed, run this command on the cluster:

    hdp-select status hadoop-client
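
    The steps above can be sketched as a small script. This is a minimal sketch, not the official procedure: it assumes `hdp-select` is only present on a real cluster, so it falls back to a sample status line (the version 2.2.0.0-2041 is illustrative), and it defaults SPARK_HOME to a temporary directory if unset.

```shell
#!/bin/sh
# Derive the HDP version and write the spark-defaults.conf entries
# plus the java-opts file, as described in the answer above.
CONF_DIR="${SPARK_HOME:-/tmp/spark-demo}/conf"
mkdir -p "$CONF_DIR"

# "hdp-select status hadoop-client" prints e.g. "hadoop-client - 2.2.0.0-2041";
# the version is the last whitespace-separated field. Fall back to a sample
# line when hdp-select is not installed (e.g. off-cluster).
STATUS=$(hdp-select status hadoop-client 2>/dev/null || echo "hadoop-client - 2.2.0.0-2041")
HDP_VERSION=$(echo "$STATUS" | awk '{print $NF}')

# Entries for spark-defaults.conf: driver and YARN application master options.
{
  echo "spark.driver.extraJavaOptions -Dhdp.version=$HDP_VERSION"
  echo "spark.yarn.am.extraJavaOptions -Dhdp.version=$HDP_VERSION"
} >> "$CONF_DIR/spark-defaults.conf"

# The java-opts file carries the same flag.
echo "-Dhdp.version=$HDP_VERSION" > "$CONF_DIR/java-opts"
```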
    
  • 2020-12-31 08:10

    I also had this issue using BigInsights 4.2.0.0 with YARN, Spark and MapReduce 2, and what was causing it was iop.version. To fix it, you have to add the iop.version variable to mapred-site, which can be done with the following steps:

    In Ambari Server go to:

    • MAPREDUCE2
    • Configs (tab)
    • Advanced (tab)
    • Click into Custom mapred-site
    • Add Property...
    • Enter iop.version as the key and your BigInsights version as the value.
    • Restart all services.

    This has fixed it.
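
    In raw mapred-site.xml terms, the Ambari steps above add a property of this shape (the value 4.2.0.0 is an example; use your installed BigInsights version):

```xml
<property>
  <name>iop.version</name>
  <!-- example value; set this to your installed BigInsights version -->
  <value>4.2.0.0</value>
</property>
```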

  • 2020-12-31 08:15

    It is caused by hdp.version not being substituted correctly. You have to set hdp.version in the file java-opts under $SPARK_HOME/conf.

    And you have to set

    spark.driver.extraJavaOptions -Dhdp.version=XXX 
    spark.yarn.am.extraJavaOptions -Dhdp.version=XXX
    

    in spark-defaults.conf under $SPARK_HOME/conf where XXX is the version of hdp.

  • 2020-12-31 08:17

    I had the same issue:

    launch_container.sh: line 24: $PWD:$PWD/__hadoop_conf__:$PWD/__spark__.jar:$HADOOP_CONF_DIR:/usr/hdp/current/hadoop-client/*::$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure: bad substitution
    

    As I couldn't find any /usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo* file, I just edited mapred-site.xml and removed "/usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:" from the classpath.
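
    For reference, that classpath typically comes from the mapreduce.application.classpath property in mapred-site.xml; the fix amounts to deleting the unresolvable hadoop-lzo entry from its value. A sketch of the edited property, keeping only the entries visible in the error message above:

```xml
<property>
  <name>mapreduce.application.classpath</name>
  <!-- removed from the value:
       /usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar: -->
  <value>$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/etc/hadoop/conf/secure</value>
</property>
```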

  • 2020-12-31 08:23
    1. Go to the YARN service in Ambari.

    click on Configs->Advanced->Custom yarn-site->Add Property ...

    Add hdp.version as the key and your HDP version as the value. You can get your HDP version with the command below:

    hdp-select versions

    e.g. 2.5.3.0-37

    Now add your property as

    hdp.version=2.5.3.0-37

    2. Otherwise, replace ${hdp.version} with your HDP version (2.5.3.0-37) in yarn-site.xml and yarn-env.sh.
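
    Expressed as raw XML, the custom yarn-site property from step 1 looks like this (using the example version above):

```xml
<property>
  <name>hdp.version</name>
  <!-- value taken from "hdp-select versions" -->
  <value>2.5.3.0-37</value>
</property>
```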
  • 2020-12-31 08:30

    This may be caused by /bin/sh being linked to dash instead of bash, which often happens on Debian-based systems.

    To fix it, run sudo dpkg-reconfigure dash and select no.
