I'm currently configuring Hadoop on a server running CentOS. When I run start-dfs.sh or stop-dfs.sh, I get the following error:
I'm not using CentOS, but here is what I have on Ubuntu 16.04.2 with hadoop-2.7.3 and jdk1.8.0_121; start-dfs.sh and stop-dfs.sh run successfully without any error:
# JAVA env
#
export JAVA_HOME=/j01/sys/jdk
export JRE_HOME=/j01/sys/jdk/jre
export PATH=${JAVA_HOME}/bin:${JRE_HOME}/bin:${PATH}:.
# HADOOP env
#
export HADOOP_HOME=/j01/srv/hadoop
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
Replace /j01/sys/jdk and /j01/srv/hadoop with your own installation paths.
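After editing, you can reload the shell and sanity-check that everything resolves (a minimal check; this assumes the exports above live in ~/.bashrc):

source ~/.bashrc        # reload the environment (assumption: exports were added to ~/.bashrc)
java -version           # should report jdk1.8.0_121
hadoop version          # should report Hadoop 2.7.3
echo $HADOOP_CONF_DIR   # should print /j01/srv/hadoop/etc/hadoop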
I also did the following one-time setup on Ubuntu, which eliminates the need to enter passwords multiple times when running start-dfs.sh:
sudo apt install openssh-server openssh-client   # make sure an SSH server is installed and running locally
ssh-keygen -t rsa                                # accept the defaults; leave the passphrase empty for passwordless login
ssh-copy-id user@localhost                       # append the public key to ~/.ssh/authorized_keys on localhost
Replace user with your own username.
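Before starting the daemons, it's worth verifying that the key-based login actually works (a quick check; user@localhost assumes a single-node setup as above):

ssh user@localhost      # should log in without a password prompt
exit                    # return to the original shell
start-dfs.sh            # should now start without asking for passwords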