Spark Error: invalid log directory /app/spark/spark-1.6.1-bin-hadoop2.6/work/app-20161018015113-0000/3/

Submitted by 放肆的年华 on 2019-12-23 23:23:39

Question


My Spark application is failing with the above error.

My Spark program does in fact write logs to that directory; both stderr and stdout are being written on all the workers.

My program used to work fine earlier. Yesterday I changed the folder that SPARK_WORKER_DIR points to, but today I put the old setting back and restarted Spark.
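
For reference, a minimal sketch of how that directory is typically overridden in standalone mode, assuming a conf/spark-env.sh like the one below (the path is illustrative, taken from the error message rather than from my actual setting):

    # conf/spark-env.sh -- directory where standalone workers create
    # per-application work dirs (stdout/stderr logs, jars, scratch space);
    # defaults to SPARK_HOME/work if unset
    export SPARK_WORKER_DIR=/app/spark/spark-1.6.1-bin-hadoop2.6/work

Workers read this variable at startup, so they must be restarted after a change, and the contents of the old directory are not migrated.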

Can anyone give me a clue as to why I am getting this error?


Answer 1:


In my case the problem was caused by enabling

SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true"

in spark-env.sh. That setting is supposed to remove old application/driver work directories, but it appears to be buggy and removes the data of running applications as well.

Just comment out that line and see if it helps.
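
For clarity, the offending line looks like this once commented out (a sketch; your spark-env.sh may pass several -D options inside one SPARK_WORKER_OPTS string):

    # conf/spark-env.sh -- periodic cleanup of worker/application dirs,
    # disabled here by commenting the export out
    # export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true"

If you still want cleanup, the standalone worker also reads spark.worker.cleanup.interval (seconds between sweeps, default 1800) and spark.worker.cleanup.appDataTtl (how long per-application data is retained, default 604800 seconds, i.e. 7 days), so raising the TTL is an alternative to disabling cleanup entirely. Per the Spark documentation, only the directories of stopped applications should be cleaned up, which is why the behavior described above is a bug.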



Source: https://stackoverflow.com/questions/40098710/spark-error-invalid-log-directory-app-spark-spark-1-6-1-bin-hadoop2-6-work-app
