The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw- (on Windows)

星月不相逢 2020-11-28 04:45

I am running Spark on Windows 7. When I use Hive, I see the following error:

    The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
16 Answers
  • 2020-11-28 05:06

    I also faced this issue. In my case it was network-related: I had installed Spark on Windows 7 while joined to a particular domain.

    The domain name can be checked here:

    Start -> Computer -> right-click -> Properties -> Computer name, domain, and workgroup settings -> Change settings -> Computer Name (tab) -> Change -> Domain name.

    When I run the spark-shell command on that domain, it works fine, without any error.

    On other networks I received the write-permission error. To avoid it, run the Spark command while connected to the domain identified in the path above.
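
    As a hedged aside (not part of the original answer): the machine's current domain can also be printed from cmd, assuming an English-locale Windows (the "Domain" label is localized):

    systeminfo | findstr /B /C:"Domain"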

  • 2020-11-28 05:08

    First of all, make sure you are using the correct winutils.exe for your OS. The next step is permissions.
    On Windows, you need to run the following command in cmd:

    D:\winutils\bin\winutils.exe chmod 777 D:\tmp\hive
    

    This assumes you have already downloaded winutils and set the HADOOP_HOME variable; a sketch of that setup follows.
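
    A minimal cmd sketch of the full setup, assuming winutils.exe was extracted to D:\winutils\bin (the D:\ paths are illustrative, not prescribed by this answer):

    rem Point HADOOP_HOME at the folder that contains bin\winutils.exe (assumed location)
    set HADOOP_HOME=D:\winutils
    set PATH=%HADOOP_HOME%\bin;%PATH%
    rem Grant full permissions on the Hive scratch dir
    winutils.exe chmod 777 D:\tmp\hive

    Note that set only affects the current cmd session; use setx or the System Properties dialog to make the variables persistent.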

  • 2020-11-28 05:08

    The issue was resolved in Spark version 2.0.2 (Nov 14, 2016); use that version. The 2.1.0 release (Dec 28, 2016) has the same issue.

  • 2020-11-28 05:09

    Using the correct version of winutils.exe did the trick for me. The winutils build should come from the version of Hadoop that Spark was pre-built for.

    Set the HADOOP_HOME environment variable so that winutils.exe is found under %HADOOP_HOME%\bin. I stored winutils.exe alongside the C:\Spark\bin files, so both SPARK_HOME and HADOOP_HOME point to the same location, C:\Spark.

    Now that winutils has been added to the path, give permissions to the hive folder:

    winutils.exe chmod 777 C:\tmp\hive
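
    A hedged cmd sketch of this layout (C:\Spark is this answer's location; adjust to your install):

    rem Both variables point at the same root in this answer's layout
    set SPARK_HOME=C:\Spark
    set HADOOP_HOME=C:\Spark
    set PATH=%SPARK_HOME%\bin;%PATH%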

  • 2020-11-28 05:11

    The main reason is that you started Spark from the wrong drive. Create the folder D:\tmp\hive, give it full permissions, and start Spark from the D: drive (D:\> spark-shell), as sketched below.

    Now it will work. :)
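
    A minimal sketch of those steps in cmd; the winutils location D:\winutils\bin is an assumption, not part of the original answer:

    rem Switch to the D: drive so Spark's scratch dir resolves there
    D:
    rem Create the scratch dir and open up its permissions
    mkdir D:\tmp\hive
    D:\winutils\bin\winutils.exe chmod 777 D:\tmp\hive
    spark-shell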

  • 2020-11-28 05:12

    You need to set this directory's permissions on HDFS, not on your local filesystem. /tmp does not mean C:\tmp unless you set fs.defaultFS in core-site.xml to file://c:/, which is probably a bad idea.

    Check it using

    hdfs dfs -ls /tmp 
    

    Set it using

    hdfs dfs -chmod 777 /tmp/hive
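
    As a hedged extra check (not in the original answer): to confirm which filesystem an unqualified path such as /tmp resolves to, print fs.defaultFS with the standard Hadoop client:

    hdfs getconf -confKey fs.defaultFS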
    