I am running Spark on Windows 7. When I use Hive, I see the following error:
The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: ...
The following solution worked for me on Windows:
C:\temp\hadoop\bin\winutils.exe chmod 777 \tmp\hive
Note that \tmp\hive is not a local directory.
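If it still complains, it may help to first check what winutils reports for that directory and then apply the permission change recursively. This is a minimal sketch assuming winutils.exe lives under C:\temp\hadoop\bin as in the command above:

REM check the current permissions Hadoop sees for \tmp\hive
C:\temp\hadoop\bin\winutils.exe ls \tmp\hive
REM make the directory (and everything under it) world-writable
C:\temp\hadoop\bin\winutils.exe chmod -R 777 \tmp\hive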
Use the latest version of winutils.exe and try again: https://github.com/steveloughran/winutils/blob/master/hadoop-2.7.1/bin/winutils.exe
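Also note that Spark can only use winutils.exe if it can find it: HADOOP_HOME (or the hadoop.home.dir system property) should point at the directory whose bin folder contains winutils.exe, and putting that bin folder on PATH usually helps too. A minimal sketch, assuming you placed it under C:\temp\hadoop\bin:

set HADOOP_HOME=C:\temp\hadoop
set PATH=%PATH%;%HADOOP_HOME%\bin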
Can you please try giving 777 permissions to the /tmp/hive folder? I think Spark runs as an anonymous user (which falls under the "other" user category), and the permission needs to be applied recursively. I had the same issue with Spark 1.5.1 for Hive, and it worked after giving 777 permissions with the command below on Linux:
chmod -R 777 /tmp/hive
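To confirm the change took effect, you can list the directory afterwards; after chmod 777 it should show drwxrwxrwx:

ls -ld /tmp/hive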
I was running a Spark test from IDEA, and in my case the issue was the wrong winutils.exe version. I think you need to match it with your Hadoop version. You can find winutils.exe here
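To see which Hadoop version your Spark build bundles (and therefore which winutils.exe to pick), one option is to run the standard Hadoop VersionInfo call from spark-shell; this is just a quick check, not something from the original answer:

// Prints the Hadoop version this Spark distribution was built against
println(org.apache.hadoop.util.VersionInfo.getVersion)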