How to prevent log bombing using logback?

Asked 2021-02-09 12:00

I'm not sure the term "bombing" is the right one. By bombing, I mean a log statement that fires many times with the same content (message and args).

For example, a denial of service…

3 Answers
  • 2021-02-09 12:43

    I created a generic DuplicateActionFilterByInsertTime class with a small implementation for logging.

    Example usage:

    DuplicateLogFilter logSilencer = new DuplicateLogFilter(10000); // 10-second filter window
    Logger logger = Logger.getLogger(DuplicateActionFilterByInsertTimeTest.class.getName());

    logSilencer.log(new LoggerMessage(logger, Level.INFO, "Hello {0}", new Object[]{"Alik"}));
    

    DuplicateLogFilter:

    class DuplicateLogFilter extends DuplicateActionFilterByInsertTime<LoggerMessage> {
        DuplicateLogFilter(int filterMillis) {
            super(filterMillis);
            addListener(new DuplicateActionFilterByInsertTime.Listener<LoggerMessage>() {
                @Override
                public void onFilteringFinished(FilteredItem<LoggerMessage> filteredItem) {
                    LoggerMessage msg = filteredItem.getItem();
                    msg.getLogger().log(Level.INFO,
                            msg.getMessage() + ". Filtered. Overall "
                                    + filteredItem.getFilterInfo().getCount() + " messages",
                            msg.getParams());
                }
    
                @Override
                public void onFilteringStarted(LoggerMessage loggerMessage) {
                    loggerMessage.getLogger().log(Level.INFO,
                            loggerMessage.getMessage() + ". Filtering duplicate logs...",
                            loggerMessage.getParams());
                }
            });
    
        }
        void log(LoggerMessage loggerMessage) {
            run(loggerMessage);
        }
    }
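
    The LoggerMessage class is not included in the answer. Below is a minimal sketch, assuming java.util.logging and that it implements Runnable (required by DuplicateActionFilterByInsertTime) so that run() performs the actual log call; equals()/hashCode() define what counts as a duplicate:

    import java.util.Arrays;
    import java.util.Objects;
    import java.util.logging.Level;
    import java.util.logging.Logger;

    class LoggerMessage implements Runnable {
        private final Logger logger;
        private final Level level;
        private final String message;
        private final Object[] params;

        LoggerMessage(Logger logger, Level level, String message, Object[] params) {
            this.logger = logger;
            this.level = level;
            this.message = message;
            this.params = params;
        }

        Logger getLogger() { return logger; }
        String getMessage() { return message; }
        Object[] getParams() { return params; }

        // The filter calls run() only for the first occurrence within the window.
        @Override
        public void run() {
            logger.log(level, message, params);
        }

        // Two messages are duplicates when logger, level, message and args all match.
        @Override
        public boolean equals(Object o) {
            if (this == o) return true;
            if (!(o instanceof LoggerMessage)) return false;
            LoggerMessage that = (LoggerMessage) o;
            return Objects.equals(logger, that.logger)
                    && Objects.equals(level, that.level)
                    && Objects.equals(message, that.message)
                    && Arrays.equals(params, that.params);
        }

        @Override
        public int hashCode() {
            return 31 * Objects.hash(logger, level, message) + Arrays.hashCode(params);
        }
    }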
    

    Base class that actually does all the filtering (`DuplicateActionFilterByInsertTime`):

    import java.util.Iterator;
    import java.util.Set;
    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.ConcurrentLinkedQueue;
    import java.util.concurrent.Executors;
    import java.util.concurrent.ScheduledExecutorService;
    import java.util.concurrent.TimeUnit;
    import java.util.concurrent.atomic.AtomicBoolean;
    import java.util.concurrent.atomic.AtomicInteger;
    import java.util.logging.Level;
    import java.util.logging.Logger;

    public class DuplicateActionFilterByInsertTime<E extends Runnable> {
    
        private static final Logger LOGGER = Logger.getLogger(DuplicateActionFilterByInsertTime.class.getName());
    
        private final long filterMillis;
    
        private final ConcurrentHashMap<E, FilterInfoImpl> actionMap = new ConcurrentHashMap<>();
    
        private final ConcurrentLinkedQueue<E> actionQueue = new ConcurrentLinkedQueue<>();
    
        private final ScheduledExecutorService scheduledExecutorService = Executors.newSingleThreadScheduledExecutor();
    
        private final AtomicBoolean purgerRegistered = new AtomicBoolean(false);
    
        private final Set<Listener<E>> listeners = ConcurrentHashMap.newKeySet();
    
        public DuplicateActionFilterByInsertTime(int filterMillis) {
            this.filterMillis = filterMillis;
        }
    
        public FilterInfo get(E e) {
            FilterInfoImpl insertionData = actionMap.get(e);
            if (insertionData == null || insertionData.isExpired(filterMillis)) {
                return null;
            }
            return insertionData;
        }
    
        public boolean run(E e) {
            actionMap.computeIfPresent(e, (e1, insertionData) -> {
                int count = insertionData.incrementAndGet();
                if (count == 2) {
                    notifyFilteringStarted(e1);
                }
                return insertionData;
            });
            boolean isNew = actionMap.computeIfAbsent(e, e1 -> {
                FilterInfoImpl insertionData = new FilterInfoImpl();
                actionQueue.add(e1);
                return insertionData;
            }).getCount() == 1;
    
            tryRegisterPurger();
    
            if (isNew) {
                e.run();
            }
            return isNew;
        }
    
        private void tryRegisterPurger() {
            if (actionMap.size() != 0 && purgerRegistered.compareAndSet(false, true)) {
                scheduledExecutorService.schedule(() -> {
                    try {
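                        // Walk the queue in insertion order: entries were appended on first
                        // insert, so once a non-expired entry is reached, everything after it
                        // is newer and can be left for the next purge run.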
                        for (Iterator<E> iterator = actionQueue.iterator(); iterator.hasNext(); ) {
                            E e = iterator.next();
                            FilterInfoImpl insertionData = actionMap.get(e);
                            if (insertionData == null || insertionData.isExpired(filterMillis)) {
                                iterator.remove();
                            }
                            if (insertionData != null && insertionData.isExpired(filterMillis)) {
                                FilterInfoImpl removed = actionMap.remove(e);
                                FilteredItem<E> filteredItem = new FilteredItem<>(e, removed);
                                notifyFilteringFinished(filteredItem);
                            } else {
                                // All the elements that were left shouldn't be purged.
                                break;
                            }
                        }
                    } finally {
                        purgerRegistered.set(false);
                        tryRegisterPurger();
                    }
                }, filterMillis, TimeUnit.MILLISECONDS);
            }
        }
    
        private void notifyFilteringFinished(FilteredItem<E> filteredItem) {
            new Thread(() -> listeners.forEach(l -> {
                try {
                    l.onFilteringFinished(filteredItem);
                } catch (Exception e) {
                    LOGGER.log(Level.WARNING, "Purge notification failed. Continuing to next one (if exists)", e);
                }
            })).start();
        }
    
        private void notifyFilteringStarted(final E e) {
            new Thread(() -> listeners.forEach(l -> {
                try {
                    l.onFilteringStarted(e);
                } catch (Exception e1) {
                    LOGGER.log(Level.WARNING, "Filtering started notification failed. Continuing to next one (if exists)", e1);
                }
            })).start();
        }
    
        public void addListener(Listener<E> listener) {
            listeners.add(listener);
        }
    
        public void removeListener(Listener<E> listener) {
            listeners.remove(listener);
        }
    
        public interface FilterInfo {
            long getInsertTimeMillis();
    
            int getCount();
        }
    
        public interface Listener<E> {
            void onFilteringStarted(E e);
            void onFilteringFinished(FilteredItem<E> filteredItem);
        }
    
        private static class FilterInfoImpl implements FilterInfo {
            private final long insertTimeMillis = System.currentTimeMillis();
            private AtomicInteger count = new AtomicInteger(1);
    
            int incrementAndGet() {
                return count.incrementAndGet();
            }
    
            @Override
            public long getInsertTimeMillis() {
                return insertTimeMillis;
            }
    
            @Override
            public int getCount() {
                return count.get();
            }
    
            boolean isExpired(long expirationMillis) {
                return insertTimeMillis + expirationMillis < System.currentTimeMillis();
            }
        }
    
        public static class FilteredItem<E> {
            private final E item;
            private final FilterInfo filterInfo;
    
            FilteredItem(E item, FilterInfo filterInfo) {
                this.item = item;
                this.filterInfo = filterInfo;
            }
    
            public E getItem() {
                return item;
            }
    
            public FilterInfo getFilterInfo() {
                return filterInfo;
            }
        }
    }
    


  • 2021-02-09 12:48

    I think I would prefer to solve the underlying issue that causes the log flood, rather than silencing the indication of the problem.

    You can configure a separate logger for the class that produces the offending messages and roll its output independently of the main log:

      <!-- Turn on debug logging for our HikariCP connection pool. -->
      <logger name="com.zaxxer.hikari" level="DEBUG" />
    
      <!-- Turn on debug logging for all loggers under com.stubbornjava -->
      <logger name="com.stubbornjava" level="DEBUG" />
    

    https://www.stubbornjava.com/posts/logging-in-java-with-slf4j-and-logback
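
    To actually roll that logger's output independently, you can give it its own rolling appender and set additivity="false" so the messages do not also reach the root appender. A sketch with assumed names and paths (HIKARI_FILE and logs/hikari.log are illustrative):

      <appender name="HIKARI_FILE" class="ch.qos.logback.core.rolling.RollingFileAppender">
        <file>logs/hikari.log</file>
        <rollingPolicy class="ch.qos.logback.core.rolling.TimeBasedRollingPolicy">
          <fileNamePattern>logs/hikari.%d{yyyy-MM-dd}.log</fileNamePattern>
          <maxHistory>7</maxHistory>
        </rollingPolicy>
        <encoder>
          <pattern>%date [%thread] %-5level %logger - %msg%n</pattern>
        </encoder>
      </appender>

      <logger name="com.zaxxer.hikari" level="DEBUG" additivity="false">
        <appender-ref ref="HIKARI_FILE" />
      </logger>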

  • 2021-02-09 12:51

    The DuplicateMessageFilter, which filters out exact duplicate messages, might fit your first requirement. Currently the filter stops all duplicate messages after a specified threshold (the allowedRepetitions setting), which might not be desirable; you would have to extend it if you prefer the duplicate count to reset on new messages or based on time (a sketch of such a time-based variant follows the example config below). It does not provide a summary of the silenced logs, however.

    Example logback config:

    <configuration>
    
      <turboFilter class="ch.qos.logback.classic.turbo.DuplicateMessageFilter" allowedRepetitions="2"/>
    
      <appender name="console" class="ch.qos.logback.core.ConsoleAppender">
        <encoder>
          <pattern>%date [%thread] %-5level %logger - %msg%n</pattern>
        </encoder>
      </appender>
    
      <root level="INFO">
        <appender-ref ref="console" />
      </root>  
    </configuration>
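
    A minimal sketch of a time-based variant, written as a separate custom TurboFilter rather than a subclass of DuplicateMessageFilter. The class name, the windowMillis property, and the unbounded map are assumptions for illustration; a production version would bound the cache:

    import java.util.concurrent.ConcurrentHashMap;

    import org.slf4j.Marker;

    import ch.qos.logback.classic.Level;
    import ch.qos.logback.classic.Logger;
    import ch.qos.logback.classic.turbo.TurboFilter;
    import ch.qos.logback.core.spi.FilterReply;

    public class TimedDuplicateMessageFilter extends TurboFilter {

        // Set from logback.xml via <windowMillis>; duplicates of the same format string
        // seen within this window are denied.
        private long windowMillis = 10_000;

        // format string -> last time it was seen (grows unbounded; a real version would cap it)
        private final ConcurrentHashMap<String, Long> lastSeen = new ConcurrentHashMap<>();

        @Override
        public FilterReply decide(Marker marker, Logger logger, Level level,
                                  String format, Object[] params, Throwable t) {
            if (format == null) {
                return FilterReply.NEUTRAL;
            }
            long now = System.currentTimeMillis();
            Long previous = lastSeen.put(format, now);
            if (previous != null && now - previous < windowMillis) {
                return FilterReply.DENY;   // duplicate within the window: silence it
            }
            return FilterReply.NEUTRAL;    // first occurrence or window elapsed: let it through
        }

        public void setWindowMillis(long windowMillis) {
            this.windowMillis = windowMillis;
        }
    }

    It would be registered the same way as the built-in filter (the package is illustrative):

    <turboFilter class="com.example.TimedDuplicateMessageFilter">
      <windowMillis>10000</windowMillis>
    </turboFilter>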
    