Log4j logs not appearing for custom log levels in Java Spark program


Question


I have been trying to introduce custom log levels in log4j through my Java code. I followed two approaches:

  1. The official documentation approach (https://logging.apache.org/log4j/2.x/manual/customloglevels.html), where I created a new level in one line of code and used it like this (a self-contained sketch follows this list):
static final Level CUSTOM = Level.forName("CUSTOM", 350);
logger.log(CUSTOM, "Test message");
  2. I also used a complete custom Level class, as described by a blog post.
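For reference, here is approach 1 as a self-contained sketch (Log4j 2 API; the class name is just for illustration):

import org.apache.logging.log4j.Level;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

public class CustomLevelExample {

    // Level.forName() returns the existing level with this name, or
    // registers a new one with the given intLevel (lower = more severe).
    private static final Level CUSTOM = Level.forName("CUSTOM", 350);

    private static final Logger logger = LogManager.getLogger(CustomLevelExample.class);

    public static void main(String[] args) {
        logger.log(CUSTOM, "Test message at the CUSTOM level");
    }
}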

For the second approach, I created the custom log level class as follows:

package kafkaExample;

import org.apache.log4j.Level;

public class CrunchifyLog4jLevel extends Level {

/**
 * Value of the CrunchifyLog4jLevel level. This value is less than DEBUG_INT
 * and higher than TRACE_INT.
 */
public static final int CRUNCHIFY_INT = DEBUG_INT - 10;

/**
 * Level representing my log level
 */
public static final Level CRUNCHIFY = new CrunchifyLog4jLevel(CRUNCHIFY_INT, "CRUNCHIFY", 10);

/**
 * Constructor
 */
protected CrunchifyLog4jLevel(int arg0, String arg1, int arg2) {
    super(arg0, arg1, arg2);

}

/**
 * Checks whether logArgument is the "CRUNCHIFY" level. If yes, returns
 * CRUNCHIFY; else calls CrunchifyLog4jLevel#toLevel(String, Level), passing
 * it Level#DEBUG as the defaultLevel.
 */
public static Level toLevel(String logArgument) {
    if ("CRUNCHIFY".equalsIgnoreCase(logArgument)) {
        return CRUNCHIFY;
    }
    return toLevel(logArgument, Level.DEBUG);
}

/**
 * Checks whether val is CrunchifyLog4jLevel#CRUNCHIFY_INT. If yes then
 * returns CrunchifyLog4jLevel#CRUNCHIFY, else calls
 * CrunchifyLog4jLevel#toLevel(int, Level) passing it Level#DEBUG as the
 * defaultLevel
 * 
 */
public static Level toLevel(int val) {
    if (val == CRUNCHIFY_INT) {
        return CRUNCHIFY;
    }
    return toLevel(val, Level.DEBUG);
}

/**
 * Checks whether val is CrunchifyLog4jLevel#CRUNCHIFY_INT. If yes
 * then returns CrunchifyLog4jLevel#CRUNCHIFY, else calls Level#toLevel(int, org.apache.log4j.Level)
 * 
 */
public static Level toLevel(int val, Level defaultLevel) {
    if (val == CRUNCHIFY_INT) {
        return CRUNCHIFY;
    }
    return Level.toLevel(val, defaultLevel);
}

/**
 * Checks whether logArgument is "CRUNCHIFY" level. If yes then returns
 * CrunchifyLog4jLevel#CRUNCHIFY, else calls
 * Level#toLevel(java.lang.String, org.apache.log4j.Level)
 * 
 */
public static Level toLevel(String logArgument, Level defaultLevel) {
    if ("CRUNCHIFY".equalsIgnoreCase(logArgument)) {
        return CRUNCHIFY;
    }
    return Level.toLevel(logArgument, defaultLevel);
}
}

My log4j.xml was as follows:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
<log4j:configuration xmlns:log4j="http://jakarta.apache.org/log4j/" debug="false">

<!-- FILE Appender -->
<appender name="FILE" class="org.apache.log4j.FileAppender">
    <param name="File" value="c:/crunchify.log" />
    <param name="Append" value="false" />
    <layout class="org.apache.log4j.PatternLayout">
        <param name="ConversionPattern" value="%t %-5p %c - %m%n" />
    </layout>
</appender>

<!-- CONSOLE Appender -->
<appender name="CONSOLE" class="org.apache.log4j.ConsoleAppender">
    <layout class="org.apache.log4j.PatternLayout">
        <param name="ConversionPattern" value="%d{ISO8601} %-5p [%c{1}] %m%n" />
    </layout>
</appender>

<!-- Limit Category and Specify Priority -->
<category name="kafkaExample">
    <priority value="CRUNCHIFY" class="kafkaExample.CrunchifyLog4jLevel" />
    <appender-ref ref="CONSOLE" />
</category>

<!-- Setup the Root category -->
<root>
    <appender-ref ref="CONSOLE" />
</root>
</log4j:configuration>

And I used the custom log level in my Java code thus:

logger.log(CrunchifyLog4jLevel.CRUNCHIFY, "Test message");
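As a sanity check (log4j 1.x API; just a debugging sketch, not part of the real program), the logger's effective level, and whether the custom level passes it, can be printed directly:

System.out.println("Effective level: " + logger.getEffectiveLevel());
System.out.println("CRUNCHIFY enabled: " + logger.isEnabledFor(CrunchifyLog4jLevel.CRUNCHIFY));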

I am creating a Spark application where these custom logs need to be printed, and neither approach worked when I ran the application as a spark-submit job on the server, even though the master was local. The full driver program is as follows:

import static org.apache.spark.sql.functions.callUDF;
import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.udf;

import java.util.ArrayList;
import java.util.List;

// Log4j 1.x logger (matches the CrunchifyLog4jLevel class above); the
// Log4j 2 Level import is needed only for the Level.forName() approach.
import org.apache.log4j.LogManager;
import org.apache.log4j.Logger;
import org.apache.logging.log4j.Level;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.expressions.UserDefinedFunction;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.StructType;

public class AccumulatorDriver {

private static Logger logger = LogManager.getLogger("CRUNCHIFY");

static final Level CUSTOM = Level.forName("CUSTOM", 350);

public static void main(String[] args) {

//      SparkSession spark = SparkSession.builder().appName("documentation")
//              .master("spark://ch3dr609552.express-scripts.com:7077").getOrCreate();

    SparkSession spark = SparkSession.builder().appName("documentation")
            .master("local").getOrCreate();

    StringAccumulator heightValues = new StringAccumulator();
    spark.sparkContext().register(heightValues);

    logger.info("Inside driver");


    UserDefinedFunction udf1 = udf(new AccumulatorUDF(heightValues), DataTypes.StringType);

    spark.sqlContext().udf().register("AccumulatorUDF", udf1);

    UserDefinedFunction udf2 = udf(new AccumulatorUDF2(heightValues), DataTypes.StringType);

    spark.sqlContext().udf().register("AccumulatorUDF2", udf2);

    List<Row> list = new ArrayList<Row>();
    list.add(RowFactory.create("one"));
    list.add(RowFactory.create("two"));
    list.add(RowFactory.create("three"));
    list.add(RowFactory.create("four"));
    List<org.apache.spark.sql.types.StructField> listOfStructField = new ArrayList<org.apache.spark.sql.types.StructField>();
    listOfStructField.add(DataTypes.createStructField("test", DataTypes.StringType, true));
    StructType structType = DataTypes.createStructType(listOfStructField);
    Dataset<Row> data = spark.createDataFrame(list, structType);
    data.show();

    data = data.withColumn("Test2", callUDF("AccumulatorUDF", col("test")));
    data.show();
    System.out.println("Heightvalues value: " + heightValues.value());

            data = data.withColumn("Test3", callUDF("AccumulatorUDF2", col("test")));
            System.out.println("Heightvalues value: " + heightValues.value());

//      data.show();

    logger.log(CrunchifyLog4jLevel.CRUNCHIFY, "Test message");
//      logger.log(CUSTOM, "Heightvalues value: " + heightValues.value());
    List<String> values = heightValues.value();

    System.out.println("Size of list: " + values.size());

}
}
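For context, I launch the job roughly like this (the jar name and config path are placeholders); as I understand it, the -Dlog4j.configuration property on the driver determines which log4j.properties file is picked up:

spark-submit \
  --class AccumulatorDriver \
  --master local \
  --driver-java-options "-Dlog4j.configuration=file:/path/to/log4j.properties" \
  accumulator-driver.jar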

However, the second approach works when I run this from Eclipse. The only change I have to make is to the following line:

private static Logger logger = LogManager.getLogger(AccumulatorDriver.class);

Do I have to change something in my Spark installation's log4j.properties file to get the logs to appear in the console? I followed this question and changed my log4j.properties accordingly. This is my log4j.properties file in Spark:

log4j.rootLogger=INFO, Console_Appender, File_Appender

log4j.appender.Console_Appender=org.apache.log4j.ConsoleAppender
log4j.appender.Console_Appender.Threshold=INFO
log4j.appender.Console_Appender.Target=System.out
log4j.appender.Console_Appender.layout=org.apache.log4j.PatternLayout
log4j.appender.Console_Appender.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c{1}:%L - %m%n

log4j.appender.File_Appender=org.apache.log4j.rolling.RollingFileAppender
log4j.appender.File_Appender.Threshold=INFO
log4j.appender.File_Appender.File=file:///opt/spark_log/spark_log.txt
log4j.appender.File_Appender.RollingPolicy=org.apache.log4j.rolling.TimeBasedRollingPolicy
log4j.appender.File_Appender.TriggeringPolicy=org.apache.log4j.rolling.SizeBasedTriggeringPolicy
log4j.appender.File_Appender.RollingPolicy.FileNamePattern=/opt/spark_log/spark_log.%d{MM-dd-yyyy}.%i.txt.gz
log4j.appender.File_Appender.RollingPolicy.ActiveFileName=/opt/spark_log/spark_log.txt
log4j.appender.File_Appender.TriggeringPolicy.MaxFileSize=1000000
log4j.appender.File_Appender.layout=org.apache.log4j.PatternLayout
log4j.appender.File_Appender.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss} %-5p %c - %m%n

log4j.logger.myLogger=INFO,File_Appender

log4j.category.kafkaExample=INFO,kafkaExample.CrunchifyLog4jLevel
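If I understand the Log4j 1.x PropertyConfigurator documentation correctly, a custom level in a properties file is referenced with the level#classname syntax rather than by listing the class after the appender names, so the last line would presumably need to look something like this (a sketch, not verified):

log4j.logger.kafkaExample=CRUNCHIFY#kafkaExample.CrunchifyLog4jLevel, Console_Appender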

Please help me get the custom log level to display even when I am running the Spark job through spark-submit. Is it because of the level number I gave the CUSTOM log level, 350? It is just below the INFO level of 400. But I also tried 550 and got the same result.
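For completeness, the Log4j 2 manual also shows declaring the custom level in configuration; my understanding of a minimal log4j2.xml for the CUSTOM level would be something like this (the appender and pattern are illustrative):

<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN">
    <CustomLevels>
        <!-- Same name/intLevel pair as Level.forName("CUSTOM", 350) -->
        <CustomLevel name="CUSTOM" intLevel="350" />
    </CustomLevels>
    <Appenders>
        <Console name="Console" target="SYSTEM_OUT">
            <PatternLayout pattern="%d{ISO8601} %-5level [%logger{1}] %msg%n" />
        </Console>
    </Appenders>
    <Loggers>
        <Root level="all">
            <AppenderRef ref="Console" />
        </Root>
    </Loggers>
</Configuration>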

Source: https://stackoverflow.com/questions/62038779/log4j-logs-not-appearing-for-custom-log-levels-in-java-spark-program
