Exception while deleting Spark temp dir in Windows 7 64 bit

Backend · unresolved · 10 answers · 948 views

走了就别回头了 2021-02-12 23:43

I am trying to run a unit test of a Spark job on Windows 7 64-bit. I have:

HADOOP_HOME=D:/winutils

winutils path = D:/winutils/bin/winutils.exe

I run the test, and when the Spark context stops I get an exception while Spark tries to delete its temp directory.
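
For reference, this is roughly the kind of setup I mean (a minimal sketch; hadoop.home.dir is the JVM property Hadoop checks when the HADOOP_HOME environment variable is not set):

import org.apache.spark.sql.SparkSession

object SparkTestSetup {
  def main(args: Array[String]): Unit = {
    // Tell Hadoop where winutils.exe lives (same effect as HADOOP_HOME).
    System.setProperty("hadoop.home.dir", "D:/winutils")

    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("unit-test")
      .getOrCreate()

    // ... run the code under test ...

    // The temp-dir cleanup (and the exception) happens at shutdown.
    spark.stop()
  }
}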

10 Answers
  • 2021-02-13 00:37

    For Python:

    Create an empty directory tmp\hive, then grant it permissions with winutils:

    import os

    # Adjust the paths to match your winutils.exe and tmp\hive locations.
    os.system(r'D:\winutils\bin\winutils.exe chmod -R 777 D:\tmp\hive')
    
  • 2021-02-13 00:38

    I was facing a similar problem. I changed the permissions on the \tmp folder instead of \tmp\hive:

    D:\> winutils\bin\winutils.exe chmod 777 \tmp

    I don't see any errors after this, and there is a clean exit.
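
    If you need to run this from test setup code rather than by hand, here is a minimal sketch that shells out to winutils (assuming it lives under D:\winutils as in the question):

    import scala.sys.process._

    // Run winutils chmod before creating the SparkSession; .! runs the
    // command and returns its exit code.
    val exit = Seq("""D:\winutils\bin\winutils.exe""", "chmod", "777", """D:\tmp""").!
    require(exit == 0, s"winutils chmod failed with exit code $exit")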

  • 2021-02-13 00:45

    I created a directory d:\spark\temp and gave the Everyone group full control over it. Then I ran

    set TEMP=d:\spark\temp

    submitted my jar to Spark, and watched the directory in Explorer.

    Many files and directories are created and deleted, but for one of them there is an exception. IMHO this is not a real problem:

    java.io.IOException: Failed to delete: D:\data\temp\spark\spark-9cc5a3ad-7d79-4317-8990-f278e63cb40b\userFiles-4c442ed7-83ba-4724-a533-5f171d830913\simple-app_2.11-1.0.jar

    This happens when Spark tries to delete the submitted package; the jar may not have been released by all of the processes involved yet.
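
    If you would rather steer Spark's scratch location from code instead of the TEMP environment variable, spark.local.dir controls where Spark puts these spark-* directories (it defaults to the system temp dir). A minimal sketch:

    import org.apache.spark.sql.SparkSession

    // Put Spark's scratch space (the spark-* directories from the
    // exception above) in a directory you control.
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("simple-app")
      .config("spark.local.dir", "d:/spark/temp")
      .getOrCreate()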

  • 2021-02-13 00:46

    Running Spark on Windows has this temp-dir deletion issue. You can raise the log level as follows to hide it:

    import org.apache.log4j.{Level, Logger}
    Logger.getLogger("org").setLevel(Level.FATAL)
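
    If suppressing everything under "org" is too broad, a narrower variant is to target only the logger that reports the deletion failure, which in Spark 2.x and later is typically org.apache.spark.util.ShutdownHookManager:

    import org.apache.log4j.{Level, Logger}

    // Silence only the shutdown-hook logger that reports the temp-dir
    // deletion failure, keeping the rest of Spark's logging intact.
    Logger.getLogger("org.apache.spark.util.ShutdownHookManager").setLevel(Level.OFF)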
