PySpark Error: java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries

Backend · Unresolved · 0 replies · 886 views
Asked by 借酒劲吻你 on 2021-01-16 11:02

I'd like to test some simple PySpark logic locally before running it on AWS. I have a script that worked fine until today:

from pyspark.sql import SparkSession
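
For context, a minimal local-mode session along these lines is what such a test script typically builds; the master setting, the app name, and the explicit spark.driver.bindAddress below are illustrative assumptions, not the poster's actual code. Pinning the driver bind address to 127.0.0.1 is a common workaround when the BindException in the title appears because the machine's hostname no longer resolves to an address Spark can bind to.

from pyspark.sql import SparkSession

# Minimal local-mode session (illustrative sketch, not the original script).
# Binding the driver to 127.0.0.1 sidesteps "Can't assign requested address"
# when the host name resolves to an address the driver cannot bind to.
spark = (
    SparkSession.builder
    .master("local[*]")
    .appName("local-test")  # hypothetical app name
    .config("spark.driver.bindAddress", "127.0.0.1")
    .getOrCreate()
)

# Quick sanity check that the session works.
df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])
df.show()
spark.stop()

Alternatively, exporting SPARK_LOCAL_IP=127.0.0.1 in the shell before launching should have the same effect without changing the script.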
