Exception while deleting Spark temp dir in Windows 7 64 bit

走了就别回头了 · 2021-02-12 23:43

I am trying to run a unit test of a Spark job on Windows 7 64-bit. I have:

HADOOP_HOME=D:/winutils

winutils path= D:/winutils/bin/winutils.exe

I r
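
The question is cut off above, but based on the setup it describes (HADOOP_HOME pointing at D:/winutils and a unit test that starts Spark locally), a minimal local test harness usually looks something like the sketch below. The D:/winutils path comes from the question; the spark.local.dir value, the object name, and the sample assertion are illustrative assumptions, not part of the question.

    // A minimal sketch of a local Spark test setup on Windows.
    import org.apache.spark.sql.SparkSession

    object SparkTestSetup {
      def main(args: Array[String]): Unit = {
        // Point Hadoop at the winutils installation (same effect as HADOOP_HOME).
        System.setProperty("hadoop.home.dir", "D:/winutils")

        val spark = SparkSession.builder()
          .appName("unit-test")
          .master("local[*]")
          // Assumed writable location for Spark's temp files on Windows.
          .config("spark.local.dir", "D:/tmp/spark")
          .getOrCreate()

        try {
          // ... run the code under test ...
          assert(spark.range(10).count() == 10)
        } finally {
          // Stop Spark explicitly so its shutdown hook has less to clean up.
          spark.stop()
        }
      }
    }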

10 Answers
  •  甜味超标
    2021-02-13 00:36

    I've set the HADOOP_HOME variable the same way you have (on Windows 10).

    Try using the complete path when setting permissions, i.e.:

    D:> winutils/bin/winutils.exe chmod 777 \tmp\hive

    This worked for me.

    Also, just a note on the exception: I get the same exception when exiting Spark from cmd by running "sys.exit".

    But I can exit cleanly when I use ":q" or ":quit", so I'm not sure what's happening here; still trying to figure it out...
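
    For what it's worth, the clean exit with ":quit" is consistent with Spark getting a chance to shut down before the JVM dies. In a unit test, the equivalent is stopping the session in a teardown. Below is a rough ScalaTest sketch; the class name MyJobSpec and the test body are made up for illustration.

    // Hedged sketch: stop the SparkSession in afterAll so Spark cleans up its
    // own temp directories instead of leaving it all to the shutdown hook,
    // which can fail on Windows when files are still held open.
    import org.apache.spark.sql.SparkSession
    import org.scalatest.BeforeAndAfterAll
    import org.scalatest.funsuite.AnyFunSuite

    class MyJobSpec extends AnyFunSuite with BeforeAndAfterAll {
      private var spark: SparkSession = _

      override def beforeAll(): Unit = {
        spark = SparkSession.builder()
          .appName("my-job-spec")
          .master("local[*]")
          .getOrCreate()
      }

      override def afterAll(): Unit = {
        // Explicit stop, roughly what ":quit" allows to happen in spark-shell.
        if (spark != null) spark.stop()
      }

      test("example") {
        assert(spark.range(5).count() == 5)
      }
    }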
