I am trying to run unit tests for a Spark job on Windows 7 64-bit. I have:
HADOOP_HOME=D:/winutils
winutils path = D:/winutils/bin/winutils.exe
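For reference, here is a minimal sketch of how the test setup wires this in programmatically. It assumes the D:/winutils layout above; the object and app names are made up. Note that hadoop.home.dir must point at the directory containing bin\winutils.exe, not at the .exe itself:

    import org.apache.spark.sql.SparkSession

    object WinutilsSmokeTest {
      def main(args: Array[String]): Unit = {
        // Equivalent to setting the HADOOP_HOME environment variable,
        // but scoped to this JVM so the test doesn't depend on the machine config.
        System.setProperty("hadoop.home.dir", "D:/winutils")

        val spark = SparkSession.builder()
          .appName("winutils-smoke-test")
          .master("local[*]")
          .getOrCreate()

        // A trivial action to confirm the session comes up.
        println(spark.range(10).count())

        spark.stop()
      }
    }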
I've set the HADOOP_HOME variable the same way as you have (on Windows 10).
Try using the complete path when setting permissions, i.e.
D:\> winutils\bin\winutils.exe chmod 777 \tmp\hive
This worked for me.
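If you'd rather not type that by hand before every test run, here's a hedged sketch that shells out to winutils from Scala as a setup step. The path and the 777 mode come from the command above; the object name is hypothetical:

    import scala.sys.process._

    object HivePermissionsFix {
      def main(args: Array[String]): Unit = {
        // Run: D:/winutils/bin/winutils.exe chmod 777 \tmp\hive
        val cmd = Seq("D:/winutils/bin/winutils.exe", "chmod", "777", "\\tmp\\hive")
        val exitCode = cmd.!   // executes the command and returns its exit code
        if (exitCode != 0)
          sys.error(s"winutils chmod failed with exit code $exitCode")
      }
    }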
Also, just a note on the exception: I get the same exception when exiting Spark from cmd by running "sys.exit",
but I can exit cleanly when I use ":q" or ":quit". So I'm not sure what's happening there; still trying to figure it out...
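One guess (an assumption, not something confirmed in this thread): sys.exit takes the JVM down while the SparkSession is still live, leaving cleanup to shutdown hooks, whereas :quit lets the shell tear Spark down first. If so, stopping the session explicitly before exiting may avoid the exception:

    // In the spark-shell, where `spark` is the predefined SparkSession.
    spark.stop()   // stop Spark first so its resources and temp dirs are released cleanly
    sys.exit(0)    // then exit; nothing Spark-related is left for shutdown hooks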