I'd like to stop the various messages that are coming on the Spark shell. I tried to edit the log4j.properties file in order to stop these messages.
sparkContext.setLogLevel("OFF")
This one worked for me.
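For the record, in spark-shell the context is already created for you as sc, so you can call it directly at the prompt (any of the usual levels works):

scala> sc.setLogLevel("ERROR")  // or "OFF", "WARN", "INFO", ...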
For only ERROR messages to be displayed on stdout, the log4j.properties file may look like this:
# Root logger option
log4j.rootLogger=ERROR, stdout
# Direct log messages to stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.Target=System.out
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n
NOTE: Put the log4j.properties file in the src/main/resources folder for it to take effect. And if log4j.properties doesn't exist (meaning Spark is using the log4j-defaults.properties file), then you can create it by going to SPARK_HOME/conf and running mv log4j.properties.template log4j.properties, and then proceed with the changes described above.
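For example, the shell commands for that last step might look like this (assuming SPARK_HOME points at your Spark installation):

cd "$SPARK_HOME/conf"
mv log4j.properties.template log4j.properties   # creates the active config from the shipped template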
Actually, there are many ways to do it. Some are harder than others, but it is up to you which one suits you best. I will try to showcase them all.
#1: Programmatically, in your code

Seems to be the easiest, but you will need to recompile your app to change those settings. Personally, I don't like it, but it works fine.
import org.apache.log4j.{Level, Logger}
val rootLogger = Logger.getRootLogger()
rootLogger.setLevel(Level.ERROR)
Logger.getLogger("org.apache.spark").setLevel(Level.WARN)
Logger.getLogger("org.spark-project").setLevel(Level.WARN)
You can achieve much more just using the log4j API.
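For instance, here is a small sketch of going beyond the root logger with the log4j 1.x API (the jetty logger name below matches the one used in Spark's own log4j.properties.template):

import org.apache.log4j.{Level, Logger}

// Quiet one particularly chatty component only
val jetty = Logger.getLogger("org.spark_project.jetty")
jetty.setLevel(Level.WARN)

// Inspect the level a logger actually resolves to (inherited from its parent if not set)
println(jetty.getEffectiveLevel)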
Source: [Log4J Configuration Docs, Configuration section]
#2: log4j.properties during spark-submit
This one is very tricky, but not impossible. And my favorite.
During app startup, Log4J always looks for and loads the log4j.properties file from the classpath.

However, when using spark-submit, the Spark cluster's classpath has precedence over the app's classpath! This is why putting this file in your fat-jar will not override the cluster's settings!
Add -Dlog4j.configuration=<location of configuration file> to spark.driver.extraJavaOptions (for the driver) or spark.executor.extraJavaOptions (for executors). Note that if using a file, the file: protocol should be explicitly provided, and the file needs to exist locally on all the nodes.
To satisfy the last condition, you can either upload the file to a location available to the nodes (like hdfs) or access it locally with the driver if using deploy-mode client. Otherwise:
upload a custom log4j.properties using spark-submit, by adding it to the --files list of files to be uploaded with the application.
Source: Spark docs, Debugging
Example log4j.properties:
# Blacklist all to warn level
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
# Whitelist our app to info :)
log4j.logger.com.github.atais=INFO
Executing spark-submit, for cluster mode:
spark-submit \
--master yarn \
--deploy-mode cluster \
--conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties" \
--conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties" \
--files "/absolute/path/to/your/log4j.properties" \
--class com.github.atais.Main \
"SparkApp.jar"
Note that you must use --driver-java-options if using client mode.
Source: Spark docs, Runtime env
Executing spark-submit, for client mode:
spark-submit \
--master yarn \
--deploy-mode client \
--driver-java-options "-Dlog4j.configuration=file:/absolute/path/to/your/log4j.properties" \
--conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=file:log4j.properties" \
--files "/absolute/path/to/your/log4j.properties" \
--class com.github.atais.Main \
"SparkApp.jar"
Notes:

- Files uploaded to the spark-cluster with --files will be available at the root dir, so there is no need to add any path in file:log4j.properties.
- Files listed with --files must be provided with an absolute path!
- The file: prefix in the configuration URI is mandatory.

#3: conf/log4j.properties

This changes the global logging configuration file.
update the $SPARK_CONF_DIR/log4j.properties file and it will be automatically uploaded along with the other configurations.
Source: Spark docs, Debugging
To find your SPARK_CONF_DIR you can use spark-shell:
atais@cluster:~$ spark-shell
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 2.1.1
/_/
scala> System.getenv("SPARK_CONF_DIR")
res0: String = /var/lib/spark/latest/conf
Now just edit /var/lib/spark/latest/conf/log4j.properties (with the example from method #2) and all your apps will share this configuration.
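For example, to flip the default threshold from INFO to ERROR in that shared file in one go (a sketch; the path comes from the transcript above, so adjust it to your own SPARK_CONF_DIR):

sed -i 's/log4j.rootCategory=INFO, console/log4j.rootCategory=ERROR, console/' /var/lib/spark/latest/conf/log4j.properties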
#4: Custom conf folder

If you like solution #3 but want to customize it per application, you can actually copy the conf folder, edit its contents, and specify it as the root configuration during spark-submit.
To specify a different configuration directory other than the default “SPARK_HOME/conf”, you can set SPARK_CONF_DIR. Spark will use the configuration files (spark-defaults.conf, spark-env.sh, log4j.properties, etc) from this directory.
Source: Spark docs, Configuration
Steps:

1. Copy the cluster's conf folder (more info in method #3).
2. Edit log4j.properties in that folder (example in method #2).
3. Set SPARK_CONF_DIR to this folder before executing spark-submit, for example:
export SPARK_CONF_DIR=/absolute/path/to/custom/conf
spark-submit \
--master yarn \
--deploy-mode cluster \
--class com.github.atais.Main \
"SparkApp.jar"
I am not sure if there is any other method, but I hope this covers the topic from A to Z. If not, feel free to ping me in the comments!
Enjoy your way!
Edit your conf/log4j.properties file and change the following line:
log4j.rootCategory=INFO, console
to
log4j.rootCategory=ERROR, console
Another approach would be to start spark-shell and type in the following:
import org.apache.log4j.Logger
import org.apache.log4j.Level
Logger.getLogger("org").setLevel(Level.OFF)
Logger.getLogger("akka").setLevel(Level.OFF)
You won't see any logs after that.
Other options for Level include: all, debug, error, fatal, info, off, trace, trace_int, warn.
Details about each can be found in the documentation.
For the Spark context you may use:

sc.setLogLevel(<logLevel>)

where logLevel can be ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE or WARN.
Internally, setLogLevel calls org.apache.log4j.Level.toLevel(logLevel) and then sets it on the root logger using org.apache.log4j.LogManager.getRootLogger().setLevel(level).
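So the call boils down to roughly the following log4j 1.x calls (a sketch of the equivalent, not Spark's actual source):

import org.apache.log4j.{Level, LogManager}

val level = Level.toLevel("WARN")          // parse the string; falls back to DEBUG if unrecognized
LogManager.getRootLogger.setLevel(level)   // apply it to the root logger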
You may directly set the logging level to OFF using:

import org.apache.log4j.{Level, LogManager}
LogManager.getLogger("org").setLevel(Level.OFF)
You can set up the default logging for the Spark shell in conf/log4j.properties. Use conf/log4j.properties.template as a starting point.
In standalone Spark applications or while in a Spark Shell session, use the following:

import org.apache.log4j.{Level, Logger}
import org.apache.hadoop.yarn.util.RackResolver

Logger.getLogger(classOf[RackResolver]).getLevel
Logger.getLogger("org").setLevel(Level.OFF)
Logger.getLogger("akka").setLevel(Level.OFF)
Use the following in conf/log4j.properties to disable logging completely:
log4j.logger.org=OFF
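If you also want to cover Akka's logger here (as the code snippets above do programmatically), a sketch of the same file with one extra line:

log4j.logger.org=OFF
log4j.logger.akka=OFF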
Reference: Mastering Spark by Jacek Laskowski.
Thanks @AkhlD and @Sachin Janani for suggesting changes in the .conf file.
The following code solved my issue:
1) Added import org.apache.log4j.{Level, Logger} in the import section.

2) Added the following lines after creation of the Spark context object, i.e. after val sc = new SparkContext(conf):
val rootLogger = Logger.getRootLogger()
rootLogger.setLevel(Level.ERROR)
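Putting it together, a minimal self-contained sketch of such an app (the object name, master and job are made up for illustration):

import org.apache.log4j.{Level, Logger}
import org.apache.spark.{SparkConf, SparkContext}

object QuietApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("QuietApp").setMaster("local[*]")
    val sc = new SparkContext(conf)

    // Silence everything below ERROR once the context exists
    val rootLogger = Logger.getRootLogger()
    rootLogger.setLevel(Level.ERROR)

    println(sc.parallelize(1 to 10).sum())  // runs without the INFO chatter
    sc.stop()
  }
}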