Why does starting a streaming query lead to “ExitCodeException exitCode=-1073741515”?

Backend · Unresolved · 4 answers · 741 views
Asked by 花落未央 on 2020-12-09 22:54

Been trying to get used to the new structured streaming but it keeps giving me below error as soon as I start a .writeStream query.

Any idea what could be causing this?

4 answers
  • 2020-12-09 23:00

    The problem I had occurred while converting a DataFrame to a Parquet file. It would create the directory, but fail with “ExitCodeException exitCode=-1073741515”.

    I ran Spark from IntelliJ 2020.2.2 x64 on Windows 10 (version 2004), with spark-3.0.1-bin-hadoop3.2 installed via Git Bash on my C drive. I downloaded the winutils binaries from https://github.com/cdarlint/winutils, after being redirected there from https://github.com/steveloughran/winutils.

    I installed them in the top-level C:\winutils directory, which contained one subdirectory named \bin\ holding winutils.exe and its associated files. This top-level path was added as a Windows environment variable named HADOOP_HOME under System variables. I also set a SPARK_HOME variable to C:\Users\name\spark-3.0.1-bin-hadoop3.2\
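The HADOOP_HOME setup above can be verified with a small pre-flight check before starting Spark; this is only a sketch, and the class and method names here are my own, not part of any Spark or Hadoop API:

```java
// Hypothetical pre-flight check (illustrative names, not a Spark/Hadoop API):
// fail fast with a readable message if HADOOP_HOME or winutils.exe is missing,
// instead of hitting the opaque ExitCodeException later.
public class WinutilsCheck {
    // Hadoop looks for winutils.exe under %HADOOP_HOME%\bin on Windows.
    static String winutilsPath(String hadoopHome) {
        String sep = hadoopHome.endsWith("\\") ? "" : "\\";
        return hadoopHome + sep + "bin\\winutils.exe";
    }

    public static void main(String[] args) {
        String hadoopHome = System.getenv("HADOOP_HOME");
        if (hadoopHome == null) {
            System.err.println("HADOOP_HOME is not set");
            return;
        }
        String exe = winutilsPath(hadoopHome);
        System.out.println(new java.io.File(exe).exists()
                ? "Found " + exe
                : "winutils.exe not found at " + exe);
    }
}
```

Running this once at startup turns a cryptic native crash into a plain missing-file message.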

    I was getting this error, until I found this SO post and Moises Trelles' answer above, and also this page https://answers.microsoft.com/en-us/insider/forum/insider_wintp-insider_repair/how-do-i-fix-this-error-msvcp100dll-is-missing/c167d686-044e-44ab-8e8f-968fac9525c5?auth=1

    I have a 64-bit system, so installed both x86 and x64 versions of msvcp100.dll as recommended in the answers.microsoft.com answer. I did not reboot, but I did close Intellij and reload and upon rerunning, the correct output (parquet file) was generated. Good luck! I'm so thankful for stackoverflow/google/internet/community of helpful people.

  • 2020-12-09 23:06

    In my case, on Windows 10, I had to change the Environment Variables -> User Variables TMP and TEMP to a custom location on another volume (D:\Temp or E:\Temp, etc.) instead of the default

    %USERPROFILE%\AppData\Local\Temp

    and also set hadoop.home.dir:

    System.setProperty("hadoop.home.dir", "$HADOOP_HOME\\winutils-master\\hadoop-2.x.x")

    Don't forget to copy the hadoop.dll to C:\Windows\System32.

    You can download the appropriate version of winutils.exe and hadoop.dll from the winutils repositories mentioned in the other answers (steveloughran/winutils or cdarlint/winutils).

    For me hadoop-2.7.1 version fixed the issue.
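The hadoop.home.dir step above can be sketched as follows; the install path is an assumed example (substitute your own), and the key point is that the property must be set before any Spark or Hadoop class initializes:

```java
// Minimal sketch: set hadoop.home.dir at the very start of main, before the
// SparkSession (or any Hadoop class) is created. The path below is an assumed
// example location matching this answer's layout, not a required one.
public class HadoopHomeSetup {
    public static void main(String[] args) {
        System.setProperty("hadoop.home.dir", "C:\\winutils-master\\hadoop-2.7.1");
        // ... build the SparkSession only after this point ...
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```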

  • 2020-12-09 23:11

    Actually, I had the same problem while running Spark unit tests on my local machine. It was caused by a failing winutils.exe in the %HADOOP_HOME%\bin folder:

    Input: %HADOOP_HOME%\bin\winutils.exe chmod 777 %SOME_TEMP_DIRECTORY%

    Output:

    winutils.exe - System Error
    The code execution cannot proceed because MSVCR100.dll was not found.
    Reinstalling the program may fix this problem.

    After some searching I found an issue on Steve Loughran's winutils project: "Windows 10: winutils.exe doesn't work".
    In particular, it says that installing the VC++ redistributable packages should fix the problem (and that worked in my case): "How do I fix this error 'msvcp100.dll is missing'"
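The manual diagnostic above can also be run from code; this is a sketch of that same chmod invocation, not part of any API, and the directory argument is just an example:

```java
import java.util.Arrays;
import java.util.List;

// Sketch of the winutils.exe diagnostic described above: build and run the
// same chmod command so its exit code (and any missing-DLL error dialog)
// surfaces directly rather than being swallowed inside Spark.
public class WinutilsChmod {
    static List<String> chmodCommand(String hadoopHome, String dir) {
        return Arrays.asList(hadoopHome + "\\bin\\winutils.exe", "chmod", "777", dir);
    }

    public static void main(String[] args) throws Exception {
        List<String> cmd = chmodCommand(System.getenv("HADOOP_HOME"), "C:\\tmp\\hive");
        Process p = new ProcessBuilder(cmd).inheritIO().start();
        System.out.println("exit code: " + p.waitFor());
    }
}
```

A nonzero exit code (or the MSVCR100.dll dialog) here confirms winutils.exe itself is the culprit, independent of Spark.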

  • 2020-12-09 23:20

    This is a Windows problem:

    The program can't start because MSVCP100.dll is missing from your computer. Try reinstalling the program to fix this problem

    You will need to install the VC++ redistributable packages:

    • Download Microsoft Visual C++ 2010 Redistributable Package (x86) from Official Microsoft Download Center

    http://www.microsoft.com/en-us/download/details.aspx?id=5555

    Install vcredist_x86.exe

    • Download Microsoft Visual C++ 2010 Redistributable Package (x64) from Official Microsoft Download Center

    http://www.microsoft.com/en-us/download/details.aspx?id=14632

    Install vcredist_x64.exe
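After installing both packages, a small check can confirm the DLLs landed where Windows looks for them; the class and method names are mine, and the paths follow the standard 64-bit layout (x64 DLLs in System32, x86 DLLs in SysWOW64):

```java
import java.io.File;

// Hypothetical post-install check: on 64-bit Windows the x64 runtime DLLs
// live in System32 and the x86 ones in SysWOW64. This builds both candidate
// paths and reports which exist.
public class MsvcpCheck {
    static String[] candidatePaths(String windowsDir, String dll) {
        return new String[] {
            windowsDir + "\\System32\\" + dll,
            windowsDir + "\\SysWOW64\\" + dll
        };
    }

    public static void main(String[] args) {
        for (String p : candidatePaths("C:\\Windows", "msvcp100.dll")) {
            System.out.println(p + (new File(p).exists() ? " - found" : " - missing"));
        }
    }
}
```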
