Question
My Spark application is failing with the error "Invalid log directory /app/spark/spark-1.6.1-bin-hadoop2.6/work/app-…" (the full path appears in the source link below).
My Spark program is actually writing logs to that directory; both stderr and stdout are being written on all the workers.
My program used to work fine. Yesterday I changed the folder that SPARK_WORKER_DIR points to, but today I put the old setting back and restarted Spark.
Can anyone give me a clue as to why I am getting this error?
Answer 1:
In my case the problem was caused by enabling SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true" in spark-env.sh. That option is supposed to remove only old app/driver data directories, but it seems to be buggy and also removes the data of running apps.
Just comment out that line and see if it helps.
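For reference, a minimal sketch of the relevant spark-env.sh fragment with the cleanup option commented out; the worker directory path here is hypothetical, keep whatever SPARK_WORKER_DIR value you had before:

    # spark-env.sh
    # Periodic worker cleanup: was deleting the data of running apps,
    # so disable it by commenting the line out:
    # export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true"

    # Hypothetical path; use your original setting:
    export SPARK_WORKER_DIR=/app/spark/work

If you do want finished apps cleaned up automatically, the standalone worker also honors spark.worker.cleanup.interval and spark.worker.cleanup.appDataTtl, so you can tune how aggressively it reclaims space instead of turning cleanup off entirely.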
Source: https://stackoverflow.com/questions/40098710/spark-error-invalid-log-directory-app-spark-spark-1-6-1-bin-hadoop2-6-work-app