I'm working on a Scala project in IntelliJ that was created through SBT. The project has Spark as one of its dependencies. I'm still in the development phase so everything is
Put your log4j.properties file under a directory marked as resources; Spark will pick up this log4j configuration.
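As a starting point, here is a minimal sketch of such a project-local log4j.properties, modelled on the template Spark ships; the WARN thresholds and the pattern layout are illustrative choices, not requirements:

```properties
# Log everything at WARN level to the console
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n

# Quiet down particularly chatty Spark internals
log4j.logger.org.apache.spark=WARN
```

In an SBT project this would typically live at src/main/resources/log4j.properties so it lands on the classpath.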
To shut logging off (or set the log level) programmatically in Spark 2.0+:
import org.apache.log4j.{Level, Logger}
Logger.getLogger("org.apache.spark").setLevel(Level.OFF)
By default Spark logs almost everything you would want to see in the logs. If you need to change the logging behaviour, you can edit log4j.properties in the conf directory of your Apache Spark installation. If you're using a prebuilt version, you can find it in the /home/coco/Applications/spark-1.4.0-bin-hadoop2.6/conf directory. There is a template file, "log4j.properties.template", that you copy to "log4j.properties" and then edit according to your needs. I hope it helps.
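After copying the template, the key line to change is usually the root category; the template ships with INFO, and lowering it to WARN (an illustrative choice) is what silences the noisy startup output:

```properties
# Was: log4j.rootCategory=INFO, console
log4j.rootCategory=WARN, console
```

Any standard log4j level works here (ALL, DEBUG, INFO, WARN, ERROR, FATAL, OFF).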
Setting the log level on the SparkContext worked for me under Eclipse
spark.sparkContext.setLogLevel("WARN")
If you are developing locally in an IDE, you can change the log level at run time:
import org.apache.log4j.{Level, LogManager}
LogManager.getRootLogger.setLevel(Level.ALL)
P.S.: Put that line after the SparkContext/SQLContext has been created in your code.
I would love to figure out how to do this with a project-local properties file (an example file would be nice), but I was able to get this done in Spark 2.2 with the following code:
import org.apache.log4j.{Level, Logger}

object MySparkApp {
  def main(args: Array[String]): Unit = {
    // Silence Spark's INFO chatter for the org.apache.spark loggers
    Logger.getLogger("org.apache.spark").setLevel(Level.WARN)
    // ... create your SparkSession and run the job here ...
  }
}