Exception while deleting Spark temp dir in Windows 7 64 bit

Asked by 走了就别回头了 on 2021-02-12 23:43 · 10 answers

I am trying to run a unit test of a Spark job on Windows 7 64-bit. I have

HADOOP_HOME=D:/winutils

winutils path= D:/winutils/bin/winutils.exe

I r
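
For context, a minimal sketch of how a test of this kind typically points Spark at the winutils install before the first session is created (the object name and the trivial job below are illustrative assumptions, not the asker's code; the hadoop.home.dir system property is the standard way Hadoop locates winutils.exe when HADOOP_HOME is not picked up):

    import org.apache.spark.sql.SparkSession

    object WinutilsSmokeTest {
      def main(args: Array[String]): Unit = {
        // Hadoop's Shell utility checks this system property before falling
        // back to the HADOOP_HOME environment variable, so setting it here
        // keeps the test independent of the machine's environment.
        System.setProperty("hadoop.home.dir", "D:/winutils")

        val spark = SparkSession.builder()
          .master("local[2]")
          .appName("winutils-smoke-test")
          .getOrCreate()

        // Trivial job just to confirm Spark starts and runs on Windows.
        println(spark.range(100).count())
        spark.stop()
      }
    }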

10 answers
  •  难免孤独
    2021-02-13 00:28

    I have a workaround for this: instead of letting Spark's ShutdownHookManager delete the temporary directories, you can issue Windows commands to do that (a consolidated sketch of the edits follows the steps).

    Steps:

    1. Change the temp directory by setting spark.local.dir in the spark-defaults.conf file.

    2. Set log4j.logger.org.apache.spark.util.ShutdownHookManager=OFF in the log4j.properties file.

    3. On Windows, spark-shell internally calls the spark-shell.cmd file, so add rmdir /q /s "your_dir\tmp" to that file.

    this should work!
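
    A consolidated sketch of the three edits described above (the D:\spark-temp location and the %SPARK_HOME% paths are assumptions for illustration; adjust them to your install):

        # %SPARK_HOME%\conf\spark-defaults.conf
        spark.local.dir                  D:/spark-temp

        # %SPARK_HOME%\conf\log4j.properties
        log4j.logger.org.apache.spark.util.ShutdownHookManager=OFF

        :: appended near the end of %SPARK_HOME%\bin\spark-shell.cmd
        rmdir /q /s "D:\spark-temp"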
