Exception while deleting Spark temp dir on Windows 7 64-bit

Asked by 走了就别回头了 on 2021-02-12 23:43

I am trying to run unit tests for a Spark job on Windows 7 64-bit. I have

HADOOP_HOME=D:/winutils

winutils path= D:/winutils/bin/winutils.exe

I r

10 Answers
    Answered by 广开言路 on 2021-02-13 00:24

    After following the suggestions above, I made the changes below.

    Update spark-defaults.conf, or create a copy of spark-defaults.conf.template
    and rename it to spark-defaults.conf.

    Add the following line, which sets the temp folder Spark will use:

    spark.local.dir=E:\spark2.4.6\tempDir
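    If editing spark-defaults.conf is not an option (e.g. in a unit test that builds its own session), the same setting can be applied programmatically. A minimal sketch, assuming a Spark 2.x SparkSession-based test; the path is just the example used above:

    import org.apache.spark.sql.SparkSession

    // Point Spark's scratch space at a folder we control
    // ("E:\\spark2.4.6\\tempDir" is only an example path).
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("spark-temp-dir-test")
      .config("spark.local.dir", "E:\\spark2.4.6\\tempDir")
      .getOrCreate()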

    Similarly, update log4j.properties in your Spark setup, as done above, with the lines below:

    log4j.logger.org.apache.spark.util.ShutdownHookManager=OFF
    log4j.logger.org.apache.spark.SparkEnv=ERROR

    With these loggers silenced, the errors that ShutdownHookManager raises while deleting temp directories on exit no longer appear on the console.
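    If the unit tests do not pick up that log4j.properties file, the same silencing can be done programmatically through the log4j 1.x API that Spark 2.4 ships with. A minimal sketch, to run before the test creates its SparkSession:

    import org.apache.log4j.{Level, Logger}

    // Suppress the shutdown-hook deletion errors for this JVM only.
    Logger.getLogger("org.apache.spark.util.ShutdownHookManager").setLevel(Level.OFF)
    Logger.getLogger("org.apache.spark.SparkEnv").setLevel(Level.ERROR)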

    How do you clean the temp folder, then?
    For that, add the lines below to the bin\spark-shell.cmd file:

    rem Remove Spark's scratch directory and any leftover jansi files
    rmdir /q /s "E:\spark2.4.6\tempDir"
    del /q "%TEMP%\jansi*.*"

    With these updates in place, I see a clean exit, with the temp folders cleaned up as well.
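    For unit tests (rather than spark-shell), a similar cleanup can run in the suite's teardown instead of a .cmd script. A sketch using plain Java NIO, assuming the example tempDir path configured above:

    import java.nio.file.{Files, Path, Paths}
    import java.util.Comparator

    // Recursively delete the Spark scratch dir after the suite finishes.
    def deleteRecursively(root: Path): Unit =
      if (Files.exists(root)) {
        Files.walk(root)                              // stream left unclosed for brevity
          .sorted(Comparator.reverseOrder[Path]())    // children before parents
          .forEach(p => Files.deleteIfExists(p))
      }

    deleteRecursively(Paths.get("E:\\spark2.4.6\\tempDir"))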
