Spark Error: invalid log directory /app/spark/spark-1.6.1-bin-hadoop2.6/work/app-20161018015113-0000/3/
Question: My Spark application is failing with the above error. My Spark program writes its logs to that directory, and both stderr and stdout are written on all the workers. The program used to work fine. Yesterday I changed the folder that SPARK_WORKER_DIR points to; today I put the old setting back and restarted Spark. Can anyone give me a clue as to why I am getting this error?

Answer 1: In my case the problem was caused by the activation of SPARK_WORKER_OPTS="-Dspark.worker
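For context, the worker log directory mentioned in the question is controlled by the SPARK_WORKER_DIR environment variable, normally set in conf/spark-env.sh on each worker node. A minimal sketch of such a change (the path /data/spark-work is a hypothetical example, not from the original post):

```shell
# conf/spark-env.sh
# SPARK_WORKER_DIR: where each standalone worker keeps application
# logs (stderr/stdout) and scratch space; the worker creates
# app-*/executor-id subdirectories underneath it at runtime.
export SPARK_WORKER_DIR=/data/spark-work
```

After editing this file, the worker processes must be restarted for the new directory to take effect; application directories created under the previous location are not moved, which is why switching SPARK_WORKER_DIR back and forth can leave the master and workers referring to directories that no longer exist.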