I am running Spark on Windows 7. When I use Hive, I see the following error:
The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
I also faced this issue, and in my case it was network-related. I installed Spark on Windows 7 while connected to a particular domain.
The domain name can be checked under:
Start -> Computer -> Right click -> Properties -> Computer name, domain and workgroup settings -> Change settings -> Computer Name (tab) -> Change -> Domain name.
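If you prefer the command line, the same information can be checked with the built-in systeminfo tool (a sketch assuming a standard Windows 7 cmd prompt):

rem print the Domain line from the system report
systeminfo | findstr /B /C:"Domain"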
When I run the spark-shell command while connected to that domain, it works fine without any error. On other networks I received the write-permission error. To avoid this error, run the Spark command while connected to the domain shown in the path above.
First of all, make sure you are using the correct winutils for your OS. The next step is permissions.
On Windows, you need to run the following command from cmd:
D:\winutils\bin\winutils.exe chmod 777 D:\tmp\hive
This assumes you have already downloaded winutils and set the HADOOP_HOME variable.
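Put together, a minimal sketch (assuming winutils.exe sits at D:\winutils\bin and the scratch dir is D:\tmp\hive, as in the command above; adjust the paths to your layout):

rem create the scratch dir if it does not exist yet
if not exist D:\tmp\hive mkdir D:\tmp\hive
rem open it up to everyone
D:\winutils\bin\winutils.exe chmod 777 D:\tmp\hive
rem verify the resulting permissions
D:\winutils\bin\winutils.exe ls D:\tmp\hive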
This issue was resolved in Spark 2.0.2 (Nov 14, 2016), so use that version. The 2.1.0 release (Dec 28, 2016) has the same issue.
Using the correct version of winutils.exe did the trick for me. The winutils should come from the version of Hadoop that your Spark distribution was pre-built for.
Set the HADOOP_HOME environment variable so that winutils.exe is found under its bin folder. I have stored winutils.exe alongside the C:\Spark\bin files, so both my SPARK_HOME and HADOOP_HOME point to the same location, C:\Spark.
Now that winutils has been added to the path, give permissions on the Hive folder using winutils.exe chmod 777 C:\tmp\hive.
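For reference, a minimal sketch of that setup from a cmd prompt (assuming Spark is unpacked at C:\Spark with winutils.exe in C:\Spark\bin, as described above):

rem point both variables at the Spark root; winutils.exe lives under its bin folder
setx HADOOP_HOME C:\Spark
setx SPARK_HOME C:\Spark
rem setx only affects new processes, so open a fresh cmd window, then:
if not exist C:\tmp\hive mkdir C:\tmp\hive
C:\Spark\bin\winutils.exe chmod 777 C:\tmp\hive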
The main reason is that you started Spark from the wrong directory. Create the folder D:\tmp\hive (give it full permissions) and start Spark from the D: drive: D:\> spark-shell.
Now it will work. :)
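In other words, something like this (a sketch assuming winutils.exe is on your PATH):

rem create the scratch dir on the drive you will launch Spark from
if not exist D:\tmp\hive mkdir D:\tmp\hive
winutils.exe chmod 777 D:\tmp\hive
rem switch to the D: drive and start Spark there
D:
spark-shell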
You need to set this directory's permissions on HDFS, not your local filesystem. /tmp doesn't mean C:\tmp unless you set fs.defaultFS in core-site.xml to file://c:/, which is probably a bad idea.
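If you are not sure which filesystem your paths resolve to, you can print the effective value of fs.defaultFS with the standard hdfs getconf command:

hdfs getconf -confKey fs.defaultFS

A value like hdfs://... means /tmp refers to HDFS; file:/// means the local filesystem.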
Check the current permissions using:
hdfs dfs -ls /tmp
Set them using:
hdfs dfs -chmod 777 /tmp/hive