Timeout Exception in Apache-Spark during program Execution

花落未央 · 2021-02-19 16:51

I am running a Bash script on macOS. This script calls a Spark method written in Scala a large number of times. I am currently trying to call this Spark method for …

5 Answers
  •  暖寄归人 · 2021-02-19 17:33

    It's an RpcTimeoutException, so spark.network.timeout (which also covers spark.rpc.askTimeout) can be tuned to larger-than-default values to handle complex workloads. You can start with these values and adjust them to your workload. See the latest documentation:

    spark.network.timeout (default: 120s): Default timeout for all network interactions. This config will be used in place of spark.core.connection.ack.wait.timeout, spark.storage.blockManagerSlaveTimeoutMs, spark.shuffle.io.connectionTimeout, spark.rpc.askTimeout or spark.rpc.lookupTimeout if they are not configured.

    Also consider increasing executor memory (spark.executor.memory). Most importantly, review your code to check whether it is a candidate for further optimization.

    Solution: set spark.network.timeout in any of the following ways (the value 600s is an example; choose it based on your requirements). A Scala sketch follows the list.

    Set via SparkConf: conf.set("spark.network.timeout", "600s")
    Set via spark-defaults.conf: spark.network.timeout 600s
    Set when calling spark-submit: --conf spark.network.timeout=600s
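
    A minimal sketch of how this might look from Scala, assuming a SparkSession-based application (the object name, memory size, and job body are placeholders, not from the original answer):

        import org.apache.spark.sql.SparkSession

        object TimeoutTunedApp {  // hypothetical application name
          def main(args: Array[String]): Unit = {
            val spark = SparkSession.builder()
              .appName("TimeoutTunedApp")
              // Raise the network/RPC timeout from the 120s default; 600s is an example value.
              .config("spark.network.timeout", "600s")
              // Optionally give executors more memory, as suggested above; size to your cluster.
              .config("spark.executor.memory", "4g")
              .getOrCreate()

            // ... run the Spark job that was hitting the RpcTimeoutException ...

            spark.stop()
          }
        }

    Equivalently, leave the code unchanged and pass the setting at launch time, e.g. spark-submit --conf spark.network.timeout=600s, which is convenient when a script calls the job many times.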
    
