Can I run dataproc jobs in cluster mode

Submitted by 泄露秘密 on 2020-07-19 06:45:27

Question


Just starting to get familiar with GCP dataproc. I've noticed when I use gcloud dataproc jobs submit pyspark that jobs are submitted with spark.submit.deployMode=client. Is spark.submit.deployMode=cluster an option for us?


Answer 1:


Yes, you can, by specifying --properties spark.submit.deployMode=cluster. Just note that driver output will be in the YARN userlogs (you can access them in Stackdriver Logging from the Console). We run in client mode by default so that driver output is streamed to you.
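A submission in cluster mode might look like the following; the cluster name, region, and script path are placeholders to replace with your own:

```shell
# Submit a PySpark job to Dataproc with the driver running on the
# cluster (YARN cluster mode) instead of the default client mode.
# "my-cluster", "us-central1", and "job.py" are example values.
gcloud dataproc jobs submit pyspark job.py \
  --cluster=my-cluster \
  --region=us-central1 \
  --properties=spark.submit.deployMode=cluster
```

With this flag set, the driver's stdout/stderr goes to the YARN userlogs rather than being streamed back to your terminal.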



Source: https://stackoverflow.com/questions/49678757/can-i-run-dataproc-jobs-in-cluster-mode
