I am trying to run unit tests for a Spark job on Windows 7 64-bit. I have:
HADOOP_HOME=D:/winutils
winutils path = D:/winutils/bin/winutils.exe
I r
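For context, a typical Windows session setup for this configuration might look like the following (a sketch; the paths assume winutils.exe really is at D:\winutils\bin\winutils.exe, as described above):

```
:: Windows cmd — set the Hadoop home and expose winutils.exe on PATH
set HADOOP_HOME=D:\winutils
set PATH=%PATH%;%HADOOP_HOME%\bin
```

These can also be set permanently under System Properties > Environment Variables instead of per-session.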
I have a workaround for this: instead of letting Spark's ShutdownHookManager delete the temporary directories itself, you can issue Windows commands to do it. Steps:
1. Change the temp directory using spark.local.dir in the spark-defaults.conf file.
2. Set log4j.logger.org.apache.spark.util.ShutdownHookManager=OFF in the log4j.properties file.
3. spark-shell internally calls the spark-shell.cmd file, so add rmdir /q /s "your_dir\tmp" at the end of it.
This should work!
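Concretely, the first two steps might look like this (a sketch; D:/spark-temp is a placeholder directory, adjust to your install):

```
# conf/spark-defaults.conf — step 1: point Spark's scratch space at a directory you control
spark.local.dir    D:/spark-temp

# conf/log4j.properties — step 2: silence the shutdown-hook deletion errors
log4j.logger.org.apache.spark.util.ShutdownHookManager=OFF
```

For step 3, append a line like `rmdir /q /s "D:\spark-temp"` at the end of bin\spark-shell.cmd so the directory is removed by Windows itself after the shell exits, rather than by Spark's shutdown hook.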