Hadoop “Unable to load native-hadoop library for your platform” warning

礼貌的吻别 2020-11-22 03:48

I'm currently configuring Hadoop on a server running CentOS. When I run start-dfs.sh or stop-dfs.sh, I get the following warning:

    WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

21 Answers
  • 2020-11-22 04:11

    The answer depends... I just installed Hadoop 2.6 from tarball on 64-bit CentOS 6.6. The Hadoop install did indeed come with a prebuilt 64-bit native library. For my install, it is here:

    /opt/hadoop/lib/native/libhadoop.so.1.0.0
    

    And I know it is 64-bit:

    [hadoop@VMWHADTEST01 native]$ ldd libhadoop.so.1.0.0
    ./libhadoop.so.1.0.0: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by ./libhadoop.so.1.0.0)
    linux-vdso.so.1 =>  (0x00007fff43510000)
    libdl.so.2 => /lib64/libdl.so.2 (0x00007f9be553a000)
    libc.so.6 => /lib64/libc.so.6 (0x00007f9be51a5000)
    /lib64/ld-linux-x86-64.so.2 (0x00007f9be5966000)
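
    (As an aside, a quicker way to confirm the architecture is the file command, which reports the ELF class directly instead of making you infer bitness from ldd's resolution paths:)

    [hadoop@VMWHADTEST01 native]$ file libhadoop.so.1.0.0
    # expect output along the lines of: ELF 64-bit LSB shared object, x86-64 ...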
    

    Unfortunately, I stupidly overlooked the answer right there staring me in the face, as I was focused on "Is this library 32 or 64 bit?":

    `GLIBC_2.14' not found (required by ./libhadoop.so.1.0.0)
    

    So, lesson learned. Anyway, the rest at least led me to being able to suppress the warning. So I continued and did everything recommended in the other answers to provide the library path using the HADOOP_OPTS environment variable, to no avail. Then I looked at the source code. The module that generates the error gives you the hint (util.NativeCodeLoader):

    15/06/18 18:59:23 WARN util.NativeCodeLoader: Unable to load native-hadoop    library for your platform... using builtin-java classes where applicable
    

    So, off to here to see what it does:

    http://grepcode.com/file/repo1.maven.org/maven2/com.ning/metrics.action/0.2.6/org/apache/hadoop/util/NativeCodeLoader.java/

    Ah, there is some debug-level logging - let's turn that on and see if we get some additional help. This is done by adding the following line to the $HADOOP_CONF_DIR/log4j.properties file:

    log4j.logger.org.apache.hadoop.util.NativeCodeLoader=DEBUG
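
    (Alternatively, Hadoop's launcher scripts honor the HADOOP_ROOT_LOGGER environment variable, so you can get DEBUG output for a single run without editing log4j.properties; note this raises logging for everything, not just NativeCodeLoader:)

    # one-off debug run of any client command that triggers the native-library load
    HADOOP_ROOT_LOGGER=DEBUG,console hdfs dfs -ls /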
    

    Then I ran a command that generates the original warning, like stop-dfs.sh, and got this goodie:

    15/06/18 19:05:19 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /opt/hadoop/lib/native/libhadoop.so.1.0.0: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by /opt/hadoop/lib/native/libhadoop.so.1.0.0)
    

    And the answer is revealed in this snippet of the debug message (the same thing the previous ldd command 'tried' to tell me):

    `GLIBC_2.14' not found (required by /opt/hadoop/lib/native/libhadoop.so.1.0.0)
    

    What version of GLIBC do I have? Here's a simple trick to find out:

    [hadoop@VMWHADTEST01 hadoop]$ ldd --version
    ldd (GNU libc) 2.12
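
    (Two other ways to check, both standard on CentOS:)

    getconf GNU_LIBC_VERSION   # glibc reports its own version
    rpm -q glibc               # or ask the package manager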
    

    So, I can't update my OS's glibc to 2.14. The only options are to build the native libraries from source on my OS, or to suppress the warning and just ignore it for now. I opted to suppress the annoying warning for now (but do plan to build from source in the future), by using the same logging setting we used to get the debug message, except now at ERROR level:

    log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
    

    I hope this helps others see that a big benefit of open source software is that you can figure this stuff out if you take some simple logical steps.

  • 2020-11-22 04:13

    For those on OS X with Hadoop installed via Homebrew, follow these steps, replacing the path and Hadoop version where appropriate:

    wget http://www.eu.apache.org/dist/hadoop/common/hadoop-2.7.1/hadoop-2.7.1-src.tar.gz
    tar xvf hadoop-2.7.1-src.tar.gz
    cd hadoop-2.7.1-src
    mvn package -Pdist,native -DskipTests -Dtar
    mv hadoop-dist/target/hadoop-2.7.1/lib /usr/local/Cellar/hadoop/2.7.1/
    

    Then update hadoop-env.sh with:

    export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true -Djava.security.krb5.realm= -Djava.security.krb5.kdc= -Djava.library.path=/usr/local/Cellar/hadoop/2.7.1/lib/native"
    
  • 2020-11-22 04:13

    I had the same problem with JDK 6. I changed to JDK 8 and the problem was solved. Try JDK 8!
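
    (For example, you can point Hadoop at a JDK 8 install in $HADOOP_HOME/etc/hadoop/hadoop-env.sh; the path below is a typical Linux location and will differ on your system:)

    # adjust to wherever JDK 8 actually lives on your machine
    export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64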

  • 2020-11-22 04:14

    Firstly: you can check the glibc version. CentOS traditionally ships conservative software, which also means the versions are old, e.g. glibc, protobuf, etc.:

    ldd --version
    ldd /opt/hadoop/lib/native/libhadoop.so.1.0.0
    

    Compare your current glibc version with the version the library needs.
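
    (To see exactly which GLIBC symbol versions the library requires, one option is to grep them out of objdump's dynamic symbol table:)

    # lists every GLIBC_x.y version referenced by the binary
    objdump -T /opt/hadoop/lib/native/libhadoop.so.1.0.0 | grep -o 'GLIBC_[0-9.]*' | sort -u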

    Secondly: if your current glibc is too old, you can update it (DownLoad Glibc).

    If your current glibc version is right, you can append the native directory to your HADOOP_OPTS:

    export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
    export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
    
  • 2020-11-22 04:15

    After continued research, as suggested by Koti, I resolved the issue.

    hduser@ubuntu:~$ cd /usr/local/hadoop
    
    hduser@ubuntu:/usr/local/hadoop$ ls
    
    bin  include  libexec      logs        README.txt  share
    etc  lib      LICENSE.txt  NOTICE.txt  sbin
    
    hduser@ubuntu:/usr/local/hadoop$ cd lib
    
    hduser@ubuntu:/usr/local/hadoop/lib$ ls
    native
    
    hduser@ubuntu:/usr/local/hadoop/lib$ cd native/
    
    hduser@ubuntu:/usr/local/hadoop/lib/native$ ls
    
    libhadoop.a       libhadoop.so        libhadooputils.a  libhdfs.so
    libhadooppipes.a  libhadoop.so.1.0.0  libhdfs.a         libhdfs.so.0.0.0
    
    hduser@ubuntu:/usr/local/hadoop/lib/native$ sudo mv * ../
    

    Cheers

  • 2020-11-22 04:15

    Move your compiled native library files to the $HADOOP_HOME/lib folder.

    Then set your environment variables by editing the .bashrc file:

    export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib  
    export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_HOME/lib"
    

    Make sure your compiled native library files are in the $HADOOP_HOME/lib folder.

    It should work.
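
    (A quick sanity check, assuming the setup above:)

    source ~/.bashrc                      # pick up the new environment variables
    ls $HADOOP_HOME/lib | grep libhadoop  # confirm the files are where java.library.path points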
