I am trying to run unit tests of a Spark job on Windows 7 64-bit. I have:
HADOOP_HOME=D:/winutils
winutils path = D:/winutils/bin/winutils.exe
My Hadoop environment on Windows 10:
HADOOP_HOME=C:\hadoop
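For reference, here is how those variables can be set once from a Command Prompt. This assumes winutils.exe lives under C:\hadoop\bin (i.e. HADOOP_HOME points to the directory that contains bin, not to bin itself); adjust the paths to your layout:

```shell
:: HADOOP_HOME must point to the folder CONTAINING bin\winutils.exe
setx HADOOP_HOME "C:\hadoop"

:: Add the bin folder to PATH so winutils.exe can be found directly
setx PATH "%PATH%;C:\hadoop\bin"
```

Note that setx only affects new processes, so reopen the Command Prompt (or restart the IDE) before running spark-submit again.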
Spark and Scala versions:
Spark-2.3.1 and Scala-2.11.8
Below is my spark-submit command:
spark-submit --class SparkScalaTest --master local[*] D:\spark-projects\SparkScalaTest\target\scala-2.11\sparkscalatest_2.11-0.1.jar D:\HDFS\output
Based on my Hadoop environment on Windows 10, I defined the following system properties in my Scala main class:
System.setProperty("hadoop.home.dir", "C:\\hadoop\\")
System.setProperty("hadoop.tmp.dir", "C:\\hadoop\\tmp")
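The key point is that these properties must be set before the SparkSession is created, so Hadoop's initialization picks them up. A minimal sketch of such a main class (the object name SparkScalaTest matches the spark-submit command above; the job body writing to args(0) is illustrative, not my actual job):

```scala
import org.apache.spark.sql.SparkSession

object SparkScalaTest {
  def main(args: Array[String]): Unit = {
    // Set BEFORE building the SparkSession, so Hadoop reads them at init
    System.setProperty("hadoop.home.dir", "C:\\hadoop\\")
    System.setProperty("hadoop.tmp.dir", "C:\\hadoop\\tmp")

    val spark = SparkSession.builder()
      .appName("SparkScalaTest")
      .getOrCreate() // --master local[*] is supplied by spark-submit

    import spark.implicits._

    // Illustrative job: write a small DataFrame to the output path
    // passed as the first program argument (D:\HDFS\output above)
    Seq("a", "b", "c").toDF("value")
      .write.mode("overwrite").csv(args(0))

    spark.stop()
  }
}
```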
Result: I am still getting the same error, but the outputs are generated correctly in the output path D:\HDFS\output passed to spark-submit.
I hope this helps you bypass the error and still get the expected results when running Spark locally on Windows.