hadoop mapreduce: java.lang.UnsatisfiedLinkError: org.apache.hadoop.util.NativeCodeLoader.buildSupportsSnappy()Z

Submitted by 六眼飞鱼酱① on 2019-11-29 02:20:26
Pradeep Jawahar

Found the following information on the Cloudera Communities:

  1. Ensure that LD_LIBRARY_PATH and JAVA_LIBRARY_PATH contain the path of the native directory holding the libsnappy.so* files.
  2. Ensure that LD_LIBRARY_PATH and JAVA_LIBRARY_PATH have been exported in the Spark environment (spark-env.sh).

For example, I use Hortonworks HDP and have the following configuration in my spark-env.sh:

export JAVA_LIBRARY_PATH=$JAVA_LIBRARY_PATH:/usr/hdp/2.2.0.0-2041/hadoop/lib/native
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/hdp/2.2.0.0-2041/hadoop/lib/native
export SPARK_YARN_USER_ENV="JAVA_LIBRARY_PATH=$JAVA_LIBRARY_PATH,LD_LIBRARY_PATH=$LD_LIBRARY_PATH"
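Once these are exported, a quick sanity check is the checknative tool that ships with Hadoop 2.x. It reports whether the native hadoop library and its compression codecs can be loaded; look for a line like "snappy: true" followed by the resolved libsnappy path (the exact output varies by version):

$ hadoop checknative -a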

Check your core-site.xml and mapred-site.xml: they should contain the correct properties and the path of the folder with the native libraries.

core-site.xml

<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
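To confirm that the running configuration actually picked up the codec list, you can query it with hdfs getconf (this reads the configuration visible on the node where you run it); it should print the comma-separated codec classes, including SnappyCodec:

$ hdfs getconf -confKey io.compression.codecs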

mapred-site.xml

<property>
  <name>mapreduce.map.output.compress</name>
  <value>true</value>
</property>

<property>
  <name>mapreduce.map.output.compress.codec</name>
  <value>org.apache.hadoop.io.compress.SnappyCodec</value>
</property>

<property>
  <name>mapreduce.admin.user.env</name>
  <value>LD_LIBRARY_PATH=/usr/hdp/2.2.0.0-1084/hadoop/lib/native</value>
</property>

LD_LIBRARY_PATH has to contain the path to libsnappy.so.
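A quick way to check that the directory on LD_LIBRARY_PATH really contains the snappy library; the path below follows the HDP example above and may differ on your cluster:

$ echo $LD_LIBRARY_PATH
$ ls /usr/hdp/2.2.0.0-1084/hadoop/lib/native/libsnappy.so*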

My problem was that my JRE did not contain the appropriate native libraries. This may or may not be because I switched the JDK from Cloudera's pre-built VM to JDK 1.7. The snappy .so files are in your hadoop/lib/native directory, and the JRE needs to have them. Adding them to the classpath did not seem to resolve my issue. I resolved it like this:

$ cd /usr/lib/hadoop/lib/native
$ sudo cp *.so /usr/java/latest/jre/lib/amd64/

Then I was able to use the SnappyCodec class. Your paths may be different though.

That seemed to get me to the next problem:

Caused by: java.lang.RuntimeException: native snappy library not available: SnappyCompressor has not been loaded.

Still trying to resolve that.

You need all the files, not only the *.so ones. Also, ideally you would include the folder in your path instead of copying the libs from there. You need to restart the MapReduce service afterwards so that the new libraries are picked up and can be used, as sketched below.
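For instance, something along these lines instead of copying; the native directory matches the Cloudera layout from the answer above, and the service names are only examples, so restart whichever MapReduce/YARN services your distribution actually runs:

$ export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/lib/hadoop/lib/native
$ export JAVA_LIBRARY_PATH=$JAVA_LIBRARY_PATH:/usr/lib/hadoop/lib/native
$ sudo service hadoop-yarn-nodemanager restart
$ sudo service hadoop-mapreduce-historyserver restart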

Niko

After removing hadoop.dll (which I had copied manually) from windows\system32 and setting HADOOP_HOME=\hadoop-2.6.4, it works!

In my case: check the Hive conf file mapred-site.xml, and check the value of the key mapreduce.admin.user.env.

I tested it on a new datanode and got the buildSupportsSnappy UnsatisfiedLinkError on the machine that had no native dependencies (libsnappy.so, etc.).
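A sketch for checking every datanode for the native library before blaming the configuration; the hostnames and the HDP glob are placeholders for your own cluster:

$ for host in datanode1 datanode2; do ssh "$host" 'ls /usr/hdp/*/hadoop/lib/native/libsnappy.so*'; done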
