Exception while deleting Spark temp dir on Windows 7 64-bit

Backend · Open · 10 replies · 977 views

Asked by 走了就别回头了, 2021-02-12 23:43

I am trying to run a unit test of a Spark job on Windows 7 64-bit. I have

HADOOP_HOME=D:/winutils

winutils path = D:/winutils/bin/winutils.exe

I r

10 Answers
  •  忘了有多久
    2021-02-13 00:26

    My Hadoop environment on Windows 10:

    HADOOP_HOME=C:\hadoop
    

    Spark and Scala versions:

    Spark-2.3.1 and Scala-2.11.8
    

    Below is my spark-submit command:

    spark-submit --class SparkScalaTest --master local[*] D:\spark-projects\SparkScalaTest\target\scala-2.11\sparkscalatest_2.11-0.1.jar D:\HDFS\output
    

    Based on my Hadoop environment on Windows 10, I defined the following system properties in my Scala main class:

    System.setProperty("hadoop.home.dir", "C:\\hadoop\\")
    System.setProperty("hadoop.tmp.dir", "C:\\hadoop\\tmp")
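
    For context, these two `System.setProperty` calls must execute before the SparkSession (and therefore any Hadoop class that looks up winutils.exe) is created. A minimal, Spark-free sketch of that setup (paths are the answerer's; the object name `HadoopPropsSetup` is mine, for illustration only):

```scala
// Minimal sketch: set the Hadoop-related JVM properties before any
// Spark/Hadoop class is loaded. Call this first in your main method.
object HadoopPropsSetup {
  def configure(): Unit = {
    // Directory containing bin\winutils.exe (adjust to your machine)
    System.setProperty("hadoop.home.dir", "C:\\hadoop\\")
    // Writable temp directory for Hadoop's local scratch files
    System.setProperty("hadoop.tmp.dir", "C:\\hadoop\\tmp")
  }
}

HadoopPropsSetup.configure()
```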
    

    Result: I am still getting the same error, but my outputs are generated correctly in the output path D:\HDFS\output passed to spark-submit.

    Hope this helps you work around the error and get the expected result when running Spark locally on Windows.
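
    For what it's worth, this particular exception is logged by Spark's ShutdownHookManager when it cannot delete the temp directory at JVM exit (Windows file locking keeps winutils/Hadoop handles open); on a local run the job has already finished, so it is harmless. A commonly suggested workaround, which is an assumption on my part and not part of the answer above, is to silence those loggers in conf/log4j.properties:

    # Assumption: Spark 2.x with a log4j 1.x-style conf/log4j.properties.
    # Hides the (harmless) temp-dir deletion errors at JVM shutdown.
    log4j.logger.org.apache.spark.util.ShutdownHookManager=OFF
    log4j.logger.org.apache.spark.SparkEnv=ERROR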
