Hadoop “Unable to load native-hadoop library for your platform” warning

礼貌的吻别 2020-11-22 03:48

I'm currently configuring Hadoop on a server running CentOS. When I run start-dfs.sh or stop-dfs.sh, I get the following error:

21 Answers
  • 2020-11-22 03:56

    I'm not using CentOS. Here is what I have on Ubuntu 16.04.2 with hadoop-2.7.3 and jdk1.8.0_121; start-dfs.sh and stop-dfs.sh run successfully without the error:

    # JAVA env
    #
    export JAVA_HOME=/j01/sys/jdk
    export JRE_HOME=/j01/sys/jdk/jre
    
    export PATH=${JAVA_HOME}/bin:${JRE_HOME}/bin:${PATH}:.
    
    # HADOOP env
    #
    export HADOOP_HOME=/j01/srv/hadoop
    export HADOOP_MAPRED_HOME=$HADOOP_HOME
    export HADOOP_COMMON_HOME=$HADOOP_HOME
    export HADOOP_HDFS_HOME=$HADOOP_HOME
    export YARN_HOME=$HADOOP_HOME
    
    export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
    export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
    

    Replace /j01/sys/jdk and /j01/srv/hadoop with your installation paths.
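
    After adding those lines, reload the shell profile and sanity-check the environment. This is a quick check of my own, assuming the exports live in something like ~/.bashrc (the answer does not say which file they go in):

    source ~/.bashrc          # reload the profile so the new variables take effect
    echo "$HADOOP_HOME"       # should print your Hadoop installation path
    hadoop version            # confirms the hadoop binary is now on PATH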

    I also did the following one-time setup on Ubuntu, which eliminates the need to enter passwords multiple times when running start-dfs.sh:

    sudo apt install openssh-server openssh-client
    ssh-keygen -t rsa
    ssh-copy-id user@localhost
    

    Replace user with your username
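
    A quick way to confirm the passwordless setup (my addition, not part of the original answer) is that an SSH login to localhost no longer prompts for a password:

    ssh user@localhost        # should open a shell without asking for a password
    exit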

  • 2020-11-22 03:57

    Basically, it is not an error; it's just a warning from Hadoop. To address it, update the environment variables:

    export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib"
    export HADOOP_COMMON_LIB_NATIVE_DIR="/usr/local/hadoop/lib/native"
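
    To verify that the native library is now being picked up (a check of my own, not in the original answer), Hadoop 2.x and later ship a checknative command:

    hadoop checknative -a     # reports whether hadoop, zlib, snappy, lz4, bzip2, openssl support is available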
  • 2020-11-22 03:58
    export HADOOP_HOME=/home/hadoop/hadoop-2.4.1  
    export PATH=$HADOOP_HOME/bin:$PATH  
    export HADOOP_PREFIX=$HADOOP_HOME  
    export HADOOP_COMMON_HOME=$HADOOP_PREFIX  
    export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_PREFIX/lib/native  
    export HADOOP_CONF_DIR=$HADOOP_PREFIX/etc/hadoop  
    export HADOOP_HDFS_HOME=$HADOOP_PREFIX  
    export HADOOP_MAPRED_HOME=$HADOOP_PREFIX  
    export HADOOP_YARN_HOME=$HADOOP_PREFIX  
    export JAVA_LIBRARY_PATH=$HADOOP_HOME/lib/native:$JAVA_LIBRARY_PATH
    
  • 2020-11-22 04:00

    In my case, after I built Hadoop on my 64-bit Linux Mint OS, I replaced the native library in hadoop/lib. The problem still persisted. Then I figured out that Hadoop was looking in hadoop/lib, not in hadoop/lib/native. So I just moved all the content from the native library to its parent directory, and the warning was gone.
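
    In shell terms, that move is roughly the following (a sketch of my own, with $HADOOP_HOME standing in for the Hadoop installation directory, which the answer doesn't name):

    # put the native libraries one level up, so hadoop/lib contains them directly
    # (the answer moved them; copying keeps the originals in place)
    cp "$HADOOP_HOME"/lib/native/* "$HADOOP_HOME"/lib/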

  • 2020-11-22 04:01

    Verified remedy from earlier postings:

    1) Checked that the libhadoop.so.1.0.0 shipped with the Hadoop distribution was compiled for my machine architecture, which is x86_64:

    [nova]:file /opt/hadoop-2.6.0/lib/native/libhadoop.so.1.0.0
    /opt/hadoop-2.6.0/lib/native/libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, BuildID[sha1]=3a80422c78d708c9a1666c1a8edd23676ed77dbb, not stripped
    

    2) Added -Djava.library.path=<path> to HADOOP_OPTS in hadoop-env.sh:

    export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true -Djava.library.path=/opt/hadoop-2.6.0/lib/native"
    

    This indeed made the annoying warning disappear.
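
    A related sanity check I'd add (not in the original posting) is that the JVM itself is 64-bit, since a 32-bit JVM cannot load a 64-bit libhadoop.so:

    java -version 2>&1 | grep -i '64-bit'   # a 64-bit HotSpot JVM prints e.g. "64-Bit Server VM"
    file -L "$(which java)"                 # alternatively, inspect the java binary's architecture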

  • 2020-11-22 04:02

    I assume you're running Hadoop on 64-bit CentOS. The reason you see that warning is that the native Hadoop library $HADOOP_HOME/lib/native/libhadoop.so.1.0.0 was actually compiled for 32-bit.

    Anyway, it's just a warning and won't impact Hadoop's functionality.

    If you do want to eliminate this warning, download the Hadoop source code, recompile libhadoop.so.1.0.0 on a 64-bit system, and replace the 32-bit one.

    Steps on how to recompile the source code on Ubuntu are included here (a rough build sketch follows the link):

    • http://www.ercoppa.org/Linux-Compile-Hadoop-220-fix-Unable-to-load-native-hadoop-library.htm
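
    For reference, the rebuild typically follows the pattern below. This is a rough sketch based on Hadoop's standard build instructions (BUILDING.txt), not the exact steps from the linked page; the package names are assumptions and vary by release, and Hadoop 2.x additionally requires protoc 2.5.0:

    # build prerequisites on Ubuntu (protoc 2.5.0 may need a manual install for Hadoop 2.x)
    sudo apt-get install build-essential cmake zlib1g-dev libssl-dev protobuf-compiler

    # compile Hadoop with the native profile; the result lands under hadoop-dist/target
    mvn package -Pdist,native -DskipTests -Dtar

    # replace the shipped 32-bit libraries with the freshly built 64-bit ones
    cp hadoop-dist/target/hadoop-*/lib/native/* "$HADOOP_HOME/lib/native/"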

    Good luck.
