I am trying to run unit tests of a Spark job on Windows 7 64-bit. I have:
HADOOP_HOME=D:/winutils
winutils path = D:/winutils/bin/winutils.exe
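For reference, a minimal way to set this up for the current command-prompt session might look like the lines below; this is a sketch, and it assumes winutils.exe lives in D:\winutils\bin and that %HADOOP_HOME%\bin needs to be on the PATH so Spark can find it:

set HADOOP_HOME=D:\winutils
set PATH=%HADOOP_HOME%\bin;%PATH%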
After following the above suggestions, I made the changes below.
Update spark-defaults.conf, or create a copy of spark-defaults.conf.template and rename the copy to spark-defaults.conf (a copy command is sketched below).
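For example, assuming Spark is installed at E:\spark2.4.6 (the install path used later in this answer), the copy can be made from a command prompt like this:

copy "E:\spark2.4.6\conf\spark-defaults.conf.template" "E:\spark2.4.6\conf\spark-defaults.conf"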
Then add a line like the one below; it sets the temp folder for Spark to use:

spark.local.dir=E:\spark2.4.6\tempDir
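One caveat: spark-defaults.conf is read as a Java properties file, in which backslashes act as escape characters, so if the path does not come through as expected, spelling it with forward slashes is the safer option:

spark.local.dir=E:/spark2.4.6/tempDir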
Similarly, update log4j.properties in your Spark setup (again copying from the template, as above) with the lines below:

log4j.logger.org.apache.spark.util.ShutdownHookManager=OFF
log4j.logger.org.apache.spark.SparkEnv=ERROR
Now ShutdownHookManager's log output is suppressed during exit, so those error lines no longer appear on the console.
Now, how do we clean the temp folder? For that, add the lines below to the bin/spark-shell.cmd file:
rmdir /q /s "E:/spark2.4.6/tempDir"
del C:\Users\nitin\AppData\Local\Temp\jansi*.*
With the above updates, I see a clean exit, with the temp folders cleaned up as well.
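If you want the cleanup to be slightly more defensive, the variant below is a sketch rather than the stock script contents: it assumes the lines are appended at the end of bin\spark-shell.cmd after the line that launches the shell, guards rmdir with if exist so it does not complain when the folder is already gone, and uses the %TEMP% variable instead of a hard-coded user temp path:

rem cleanup after the shell exits; paths are the examples from this answer
if exist "E:\spark2.4.6\tempDir" rmdir /q /s "E:\spark2.4.6\tempDir"
del /q "%TEMP%\jansi*.*" 2>nul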