Spark-submit can't locate local file


I've written a very simple Python script for testing my Spark Streaming idea, and plan to run it on my local machine to mess around a little bit. Here is the command line:

1 Answer
  • 2020-12-07 00:30

    Supposing you want to submit a Python script located at /home/user/scripts/spark_streaming.py to YARN with spark-submit, the correct syntax is as follows:

    spark-submit --master yarn --deploy-mode client /home/user/scripts/spark_streaming.py
    

    You can interchange the ordering of the various flags, but the script itself must come last; if your script accepts arguments, they should follow the script name (e.g. the pi-calculation example shipped with Spark passes its numeric argument this way), as in the sketch below.
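    A minimal sketch of passing arguments through spark-submit, assuming a hypothetical script /home/user/scripts/pi.py that reads one numeric argument from sys.argv:

    # flags may appear in any order; the script path comes last,
    # and any script arguments follow the script path
    spark-submit \
      --master yarn \
      --deploy-mode client \
      /home/user/scripts/pi.py 10    # "10" ends up in sys.argv[1] inside the script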

    For executing locally with, say, 2 cores, use --master local[2]; use --master local[*] to grab all available local cores (no --deploy-mode flag in either case), as shown below.
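    For instance, reusing the hypothetical script path from above (a sketch, not taken from the original post):

    spark-submit --master local[2] /home/user/scripts/spark_streaming.py      # 2 local cores
    spark-submit --master 'local[*]' /home/user/scripts/spark_streaming.py    # all local cores (quoted so the shell does not glob the brackets)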

    Check the docs for more info (although, admittedly, they are rather thin on PySpark examples).

    PS: The mention of Jupyter, as well as the path shown in your error message, is extremely puzzling...

    UPDATE: It seems that PYSPARK_DRIVER_PYTHON=jupyter messes everything up, funneling execution through Jupyter (which is undesirable here, and may explain the weird error message). Try modifying the environment variables in your .bashrc as follows:

    export SPARK_HOME="/usr/local/spark"  # do not include /bin
    export PYSPARK_PYTHON=python
    export PYSPARK_DRIVER_PYTHON=python
    export PYSPARK_DRIVER_PYTHON_OPTS=""
    

    and then source your .bashrc so the changes take effect in the current shell.
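    A quick sketch of the final steps (hypothetical script path as above; the exact spark-submit line depends on your setup):

    source ~/.bashrc                  # reload the updated environment variables
    echo $PYSPARK_DRIVER_PYTHON       # should now print "python", not "jupyter"
    spark-submit --master local[2] /home/user/scripts/spark_streaming.py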
