I am trying to run unit tests of a Spark job on Windows 7 64-bit. I have
HADOOP_HOME=D:/winutils
winutils path = D:/winutils/bin/winutils.exe
I ran winutils.exe chmod 777 on \tmp\hive, but the error is still there.
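Roughly, the test bootstrap looks like this (a minimal sketch, assuming Spark 2.x; the master, app name, and the hadoop.home.dir property mirroring HADOOP_HOME above are placeholders, not verbatim from my project):

import org.apache.spark.sql.SparkSession

object SparkTestBootstrap {
  def main(args: Array[String]): Unit = {
    // Mirror HADOOP_HOME so Hadoop's shell utilities can find winutils.exe.
    System.setProperty("hadoop.home.dir", "D:\\winutils")

    val spark = SparkSession.builder()
      .master("local[2]")         // placeholder: local master for unit tests
      .appName("spark-unit-test") // placeholder app name
      .getOrCreate()
    try {
      // ... run the job under test against `spark` here ...
    } finally {
      spark.stop()
    }
  }
}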
For Python: create an empty directory tmp\hive, then make it world-writable with winutils:

import os

# paths from the question; adjust to where winutils.exe lives and where you created tmp\hive
os.system(r"D:\winutils\bin\winutils.exe chmod -R 777 D:\tmp\hive")
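You can verify that the permissions took effect with winutils itself:

D:\>winutils\bin\winutils.exe ls \tmp\hive

The directory should now be reported as drwxrwxrwx.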
I was facing a similar problem. I changed the permissions on the \tmp folder instead of \tmp\hive:

D:\>winutils\bin\winutils.exe chmod 777 \tmp

I see no error after this, and there is a clean exit.
I create a directory d:\spark\temp
I give Everybody full control over this directory.
I run
set TEMP=d:\spark\temp
Then I submit my jar to Spark and watch the directory in Explorer.
Many files and directories are created/deleted but for one of them there is an exception.
IMHO this is not a real problem:
java.io.IOException: Failed to delete: D:\data\temp\spark\spark-9cc5a3ad-7d79-4317-8990-f278e63cb40b\userFiles-4c442ed7-83ba-4724-a533-5f171d830913\simple-app_2.11-1.0.jar
This happens when Spark tries to delete the submitted package; it may not yet have been released by every process involved.
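A related option is to point Spark's own scratch space at a directory you control instead of relying on TEMP; spark.local.dir is the standard setting for this (a sketch reusing this answer's d:/spark/temp):

import org.apache.spark.sql.SparkSession

object LocalDirExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      // Spark scratch files (spark-*/userFiles-* directories) land here.
      .config("spark.local.dir", "d:/spark/temp")
      .getOrCreate()
    spark.stop()
  }
}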
Running Spark on Windows has this known issue with deleting Spark temp directories. You can raise the log level as follows to hide it:

import org.apache.log4j.{Level, Logger}

Logger.getLogger("org").setLevel(Level.FATAL)
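If silencing everything under "org" is too broad, the "Failed to delete" trace is typically logged by Spark's shutdown hook, so you can turn off just that logger (a sketch, assuming the log4j 1.x API that Spark 1.x/2.x bundle):

import org.apache.log4j.{Level, Logger}

// Silence only the shutdown-hook cleanup logger rather than all of "org".
Logger.getLogger("org.apache.spark.util.ShutdownHookManager").setLevel(Level.OFF)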