Apache Spark: setting spark.eventLog.enabled and spark.eventLog.dir at submit or Spark start

Asked 2021-02-01 07:47

I would like to set spark.eventLog.enabled and spark.eventLog.dir at the spark-submit or start-all level -- not require it to be enabled in the application code.

2 Answers
  • 2021-02-01 08:06

    I solved the problem, though strangely I had tried this before... In any case, it now seems to be a stable solution:

    Create a directory in HDFS for logging, say /eventLogging

    hdfs dfs -mkdir /eventLogging
    
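    Depending on how HDFS permissions are configured on your cluster, the directory may also need to be writable by whichever user submits the application. One permissive way to do that (adjust the owner and mode to your own policy) is:

    hdfs dfs -chmod 1777 /eventLogging
    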

    Then spark-shell or spark-submit (or whatever) can be run with the following options:

    --conf spark.eventLog.enabled=true --conf spark.eventLog.dir=hdfs://<hdfsNameNodeAddress>:8020/eventLogging
    

    such as:

    spark-shell --conf spark.eventLog.enabled=true --conf spark.eventLog.dir=hdfs://<hdfsNameNodeAddress>:8020/eventLogging
    
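    After an application finishes, the event logs written there can be browsed with the Spark History Server. A minimal sketch, reusing the same placeholder name node address and assuming $SPARK_HOME points at your Spark installation (the History Server web UI listens on port 18080 by default):

    # Confirm an event log file was written for the finished application
    hdfs dfs -ls hdfs://<hdfsNameNodeAddress>:8020/eventLogging

    # Point the History Server at the same directory and start it
    export SPARK_HISTORY_OPTS="-Dspark.history.fs.logDirectory=hdfs://<hdfsNameNodeAddress>:8020/eventLogging"
    $SPARK_HOME/sbin/start-history-server.sh
    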
  • 2021-02-01 08:27

    Create a local directory (this is Spark's default spark.eventLog.dir, so no explicit log directory needs to be configured):

    $ mkdir /tmp/spark-events
    

    Run spark-submit with --conf spark.eventLog.enabled=true (note that --conf takes a key=value pair):

    $ spark-submit --conf spark.eventLog.enabled=true --class com.MainClass --packages packages_if_any --master local[4] app.jar
    
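    If you prefer to be explicit, or want a different local directory, the same thing can be spelled out with both options; a sketch, using the default path as an example:

    $ spark-shell --conf spark.eventLog.enabled=true --conf spark.eventLog.dir=file:///tmp/spark-events
    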