Snappy compression not working due to tmp folder privileges


Question


I have a problem whenever I try to store my data in a compressed format with Pig, Sqoop, or Spark. I know the cause is that our tmp folder is mounted noexec, which makes Snappy, for instance, fail with this error:

java.lang.IllegalArgumentException: java.lang.UnsatisfiedLinkError: /tmp/snappy-1.1.2-fe4e30d0-e4a5-4b1a-ae31-fd1861117288-libsnappyjava.so: /tmp/snappy-1.1.2-fe4e30d0-e4a5-4b1a-ae31-fd1861117288-libsnappyjava.so: failed to map segment from shared object: Operation not permitted
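As a quick sanity check, you can confirm that /tmp really is mounted with noexec before touching any configuration; this is a minimal sketch using findmnt on a standard Linux host:

# Print the mount options for the filesystem backing /tmp;
# a "noexec" entry means shared libraries extracted there cannot be loaded.
findmnt -no OPTIONS /tmp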

The solutions I found on the internet are either to remount the tmp folder as exec, which is not an option for me because the sysadmin won't allow it due to security concerns, or to change the Java options so that the temporary directory points somewhere other than tmp.

I have tried the following approach, but it didn't solve the problem: I added these lines to hadoop-env.sh and sqoop-env.sh.

export HADOOP_OPTS="$HADOOP_OPTS -Dorg.xerial.snappy.tempdir=/newpath"
export HADOOP_OPTS="$HADOOP_OPTS -Djava.io.tmpdir=/newpath"
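One possible reason this didn't help is that HADOOP_OPTS only affects the client-side JVM, while the map/reduce task JVMs take their options from mapreduce.map.java.opts and mapreduce.reduce.java.opts, and Spark executors take theirs from spark.executor.extraJavaOptions. Below is a minimal sketch of passing the same properties to those JVMs; your_app.jar, the JDBC URL, and the table name are hypothetical placeholders, and /newpath is assumed to be writable and on an exec-mounted filesystem:

# Spark: point both driver and executor JVMs at an exec-mounted temp directory
spark-submit \
  --conf "spark.driver.extraJavaOptions=-Djava.io.tmpdir=/newpath -Dorg.xerial.snappy.tempdir=/newpath" \
  --conf "spark.executor.extraJavaOptions=-Djava.io.tmpdir=/newpath -Dorg.xerial.snappy.tempdir=/newpath" \
  your_app.jar

# Sqoop (runs MapReduce tasks): pass the same properties to the map and reduce JVMs
# (note that -D generic options must come before the tool-specific arguments)
sqoop import \
  -D mapreduce.map.java.opts="-Djava.io.tmpdir=/newpath -Dorg.xerial.snappy.tempdir=/newpath" \
  -D mapreduce.reduce.java.opts="-Djava.io.tmpdir=/newpath -Dorg.xerial.snappy.tempdir=/newpath" \
  --connect jdbc:mysql://dbhost/db --table mytable \
  --compression-codec org.apache.hadoop.io.compress.SnappyCodec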

I would appreciate any other solutions that could fix this issue.

Thanks


Answer 1:


For other users with this issue, try starting Hive with

hive --hiveconf org.xerial.snappy.tempdir=/../

and supply a location from which files can be executed.
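As a usage sketch, with /var/tmp/snappy as a hypothetical directory (it just needs to be writable and on an exec-mounted filesystem):

# Hypothetical exec-mounted, writable directory for Snappy's native library
mkdir -p /var/tmp/snappy
hive --hiveconf org.xerial.snappy.tempdir=/var/tmp/snappy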



Source: https://stackoverflow.com/questions/47211407/snappy-compression-not-working-due-to-tmp-folder-previliges
