Unable to start daemons using start-dfs.sh

忘了有多久 2021-01-13 10:06

We are using the cdh4-0.0 distribution from Cloudera. We are unable to start the daemons using the command below.

> start-dfs.sh
Starting namenodes on [localhost]


        
1 Answer
被撕碎了的回忆 2021-01-13 10:58

    Looks like you're using tarballs?

    Try overriding the default HADOOP_LOG_DIR location in your etc/hadoop/hadoop-env.sh config file, like so:

    export HADOOP_LOG_DIR=/path/to/hadoop/extract/logs/
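
    If that directory doesn't exist yet, it can also help to create it and check that the user starting HDFS can write to it before retrying (the path below is just the placeholder from above):

    mkdir -p /path/to/hadoop/extract/logs/   # create the log directory up front
    ls -ld /path/to/hadoop/extract/logs/     # confirm it is writable by the user running HDFS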
    

    Then retry sbin/start-dfs.sh, and it should work.
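
    One quick way to confirm the daemons actually came up (assuming jps from your JDK is on the PATH):

    jps    # should list NameNode, DataNode and SecondaryNameNode among the running Java processes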

    In packaged environments, the start/stop scripts are tuned to give each type of service its own location via the same HADOOP_LOG_DIR environment variable, so they do not hit the issue you're seeing.
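
    For reference, a rough sketch of what such a packaged per-service override can look like (the file name and paths here are illustrative of typical packaging, not taken from this question; check your own distribution's defaults):

    # e.g. a per-service defaults file sourced by the init script (illustrative path)
    # /etc/default/hadoop-hdfs-namenode
    export HADOOP_LOG_DIR=/var/log/hadoop-hdfs
    export HADOOP_PID_DIR=/var/run/hadoop-hdfs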

    If you are using packages instead, don't use these scripts; just do:

    service hadoop-hdfs-namenode start
    service hadoop-hdfs-datanode start
    service hadoop-hdfs-secondarynamenode start
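
    If you go that route, you can check each daemon the same way, for example (same service names as above; log locations vary by distribution, but packaged HDFS daemons typically log under /var/log/hadoop-hdfs/):

    service hadoop-hdfs-namenode status
    tail -n 50 /var/log/hadoop-hdfs/*namenode*.log   # illustrative log location, adjust to your packaging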
    
