The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw- (on Windows)

Backend · Unresolved · 16 answers · 1040 views
Asked by 星月不相逢 on 2020-11-28 04:45

I am running Spark on Windows 7. When I use Hive, I see the following error

The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw- (on Windows)
16 Answers
  • 2020-11-28 05:25

    The following solution worked for me on Windows:

    • First, I defined HADOOP_HOME. It is described in detail here
    • Next, I did as Nishu Tayal suggested, but with one difference: C:\temp\hadoop\bin\winutils.exe chmod 777 \tmp\hive

    Note that \tmp\hive is not a local directory
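    The two steps above can be sketched as a small pre-flight check run before starting Spark. This is a minimal sketch, not part of the original answer: the helper name `check_hadoop_env` and the error handling are assumptions.

    ```python
    import os

    def check_hadoop_env():
        """Verify HADOOP_HOME is set and winutils.exe exists under its bin/ directory.

        Hypothetical helper: raises if the environment is not set up as the
        answer above describes, otherwise returns the winutils.exe path.
        """
        hadoop_home = os.environ.get("HADOOP_HOME")
        if not hadoop_home:
            raise EnvironmentError("HADOOP_HOME is not set")
        winutils = os.path.join(hadoop_home, "bin", "winutils.exe")
        if not os.path.isfile(winutils):
            raise FileNotFoundError("winutils.exe not found at " + winutils)
        return winutils
    ```

    Once this returns a path, that binary is the one to invoke with `chmod 777 \tmp\hive`.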

  • 2020-11-28 05:30

    Use the latest version of winutils.exe and try again: https://github.com/steveloughran/winutils/blob/master/hadoop-2.7.1/bin/winutils.exe

  • 2020-11-28 05:30

    Try giving 777 permissions to the folder /tmp/hive, because I think Spark runs as an anonymous user (which falls into the "other" user category), and the permission change should be recursive. I had the same issue with Spark 1.5.1 for Hive, and it worked after granting 777 permissions with the command below on Linux:

    chmod -R 777 /tmp/hive
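    On Linux, the recursive chmod above can also be done from Python, which is handy when permissions need to be reset from a setup script. A hedged sketch; the function name `chmod_recursive` is hypothetical and not from the original answer:

    ```python
    import os

    def chmod_recursive(path, mode=0o777):
        """Apply `mode` to `path` and everything under it (like `chmod -R`)."""
        os.chmod(path, mode)
        for root, dirs, files in os.walk(path):
            for name in dirs + files:
                os.chmod(os.path.join(root, name), mode)

    # Example: chmod_recursive("/tmp/hive")  # equivalent of chmod -R 777 /tmp/hive
    ```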
    
  • 2020-11-28 05:30

    I was running a Spark test from IDEA, and in my case the issue was the wrong winutils.exe version. I think you need to match it with your Hadoop version. You can find winutils.exe here
