Configurable max value length for key-value pairs and MDC
ntngel1 opened this issue · 6 comments
Is your feature request related to a problem? Please describe.
Recently we started facing an issue with pushing too much data in key-value pairs and MDC; one possible example is the huge request body of a request we're trying to log. Elasticsearch has a configured maximum field length and we need to respect it.
Describe the solution you'd like
Add new properties for configuring max value length for key-value pairs and MDC.
Some possible options (sketched below in a hypothetical configuration) are:
maxKeyValueLength
maxMdcLength
maxMdcValueLength
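For illustration, the encoder configuration could then look something like this – purely hypothetical, since none of these properties exist in logback-gelf today:

<encoder class="de.siegmar.logbackgelf.GelfEncoder">
    <!-- hypothetical properties proposed in this issue, not implemented -->
    <maxKeyValueLength>32000</maxKeyValueLength>
    <maxMdcValueLength>32000</maxMdcValueLength>
</encoder>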
Describe alternatives you've considered
I could trim key-value pairs somewhere else, for example by creating a class in my app for handling the mapped diagnostic context that trims long values (see the rough sketch below).
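Something like this rough sketch – the TruncatingMdc name and the 20,000-character limit are just placeholders:

import org.slf4j.MDC;

public final class TruncatingMdc {

    private static final int MAX_VALUE_LENGTH = 20_000; // placeholder limit

    private TruncatingMdc() {
    }

    // Put a value into the MDC, trimming it if it exceeds the limit
    public static void put(String key, String value) {
        if (value != null && value.length() > MAX_VALUE_LENGTH) {
            value = value.substring(0, MAX_VALUE_LENGTH);
        }
        MDC.put(key, value);
    }

    public static void remove(String key) {
        MDC.remove(key);
    }
}

and then call TruncatingMdc.put(...) instead of MDC.put(...) wherever large values might be added.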
I'd like to try to implement it if you give me the green light.
Thanks for bringing this up. It seems this is a long-standing issue within Graylog itself – see Graylog2/graylog2-server#873.
As the GELF protocol does not impose any restrictions on the length of key-value pairs, I don't want to add trimming functionality to logback-gelf. I consider 32K values a rare edge case and would recommend handling this in your application.
You could create a custom MDC mapper:
import java.util.Map;
import java.util.function.BiConsumer;

import ch.qos.logback.classic.spi.ILoggingEvent;

import de.siegmar.logbackgelf.GelfFieldMapper;

public class CustomMdcDataFieldMapper implements GelfFieldMapper<String> {

    private static final int MAX_LENGTH = 20_000;

    @Override
    public void mapField(ILoggingEvent event, BiConsumer<String, String> valueHandler) {
        Map<String, String> mdcPropertyMap = event.getMDCPropertyMap();
        if (mdcPropertyMap != null) {
            // pass every MDC entry to the encoder, trimming overlong values
            mdcPropertyMap.forEach((k, v) -> valueHandler.accept(k, trimToLength(v)));
        }
    }

    private static String trimToLength(String value) {
        return value.length() > MAX_LENGTH ? value.substring(0, MAX_LENGTH) : value;
    }
}
And then use it in your logback configuration:
<configuration>
    <appender name="GELF" class="de.siegmar.logbackgelf.GelfUdpAppender">
        <graylogHost>localhost</graylogHost>
        <graylogPort>12201</graylogPort>
        <encoder class="de.siegmar.logbackgelf.GelfEncoder">
            <includeMdcData>false</includeMdcData>
            <fieldMapper class="your.package.CustomMdcDataFieldMapper"/>
        </encoder>
    </appender>
    <root level="debug">
        <appender-ref ref="GELF" />
    </root>
</configuration>
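Setting includeMdcData to false should prevent the encoder's default MDC handling from adding the untrimmed values a second time, so only the custom mapper contributes MDC fields.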
I didn't have a chance to test this, but it should work.
Thank you for the solution – I didn't even know it was possible to write a custom field mapper to handle such cases. Definitely going to use it.
Well, I tried this and can't really make it work. I have the following config:
<?xml version="1.0" encoding="UTF-8"?>
<!-- https://www.playframework.com/documentation/latest/SettingsLogger -->
<configuration>
    <conversionRule conversionWord="coloredLevel" converterClass="play.api.libs.logback.ColoredLevel" />
    <include resource="shared/console.xml"/>
    <appender name="GELF" class="de.siegmar.logbackgelf.GelfUdpAppender">
        <graylogHost>${graylog.server}</graylogHost>
        <graylogPort>${graylog.port}</graylogPort>
        <encoder class="de.siegmar.logbackgelf.GelfEncoder">
            <originHost>${graylog.source}</originHost>
            <includeRawMessage>false</includeRawMessage>
            <includeMarker>true</includeMarker>
            <includeCallerData>false</includeCallerData>
            <includeRootCauseData>false</includeRootCauseData>
            <includeLevelName>false</includeLevelName>
            <includeKeyValues>false</includeKeyValues>
            <fieldMapper class="utils.logging.TestMapper"/>
            <shortMessageLayout class="ch.qos.logback.classic.PatternLayout">
                <pattern>%.-${graylog.maxMessageLength}m%nopex</pattern>
            </shortMessageLayout>
            <fullMessageLayout class="ch.qos.logback.classic.PatternLayout">
                <pattern>%.-${graylog.maxMessageLength}m%n</pattern>
            </fullMessageLayout>
            <numbersAsString>false</numbersAsString>
        </encoder>
    </appender>
    <logger name="play" level="INFO" />
    <logger name="application" level="DEBUG" />
    <logger name="reactivemongo" level="INFO" />
    <appender name="ASYNCSTDOUT" class="ch.qos.logback.classic.AsyncAppender">
        <appender-ref ref="STDOUT" />
    </appender>
    <root level="WARN">
        <appender-ref ref="ASYNCSTDOUT" />
        <appender-ref ref="GELF" />
    </root>
</configuration>
Mapper:
package utils.logging;

import java.util.function.BiConsumer;

import ch.qos.logback.classic.spi.ILoggingEvent;

import de.siegmar.logbackgelf.GelfFieldMapper;

public class TestMapper implements GelfFieldMapper<String> {

    @Override
    public void mapField(ILoggingEvent event, BiConsumer<String, String> valueHandler) {
        // no-op mapper used for testing
    }
}
Stacktrace:
18:53:52,265 |-ERROR in ch.qos.logback.core.model.processor.ImplicitModelHandler - Could not create component [fieldMapper] of type [utils.logging.TestMapper] java.lang.ClassNotFoundException: utils.logging.TestMapper
at java.lang.ClassNotFoundException: utils.logging.TestMapper
at at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:445)
at at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:592)
at at kamon.instrumentation.sbt.KanelaOnSystemClassLoader.loadClass(KanelaOnSystemClassLoader.java:33)
at at kamon.instrumentation.sbt.play.SbtKanelaRunnerPlay$SbtKanelaClassLoader.loadClass(SbtKanelaRunnerPlay.scala:77)
at at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:525)
at at ch.qos.logback.core.util.Loader.loadClass(Loader.java:132)
at at ch.qos.logback.core.model.processor.ImplicitModelHandler.doComplex(ImplicitModelHandler.java:134)
at at ch.qos.logback.core.model.processor.ImplicitModelHandler.handle(ImplicitModelHandler.java:94)
at at ch.qos.logback.core.model.processor.DefaultProcessor.secondPhaseTraverse(DefaultProcessor.java:241)
at at ch.qos.logback.core.model.processor.DefaultProcessor.secondPhaseTraverse(DefaultProcessor.java:253)
at at ch.qos.logback.core.model.processor.DefaultProcessor.secondPhaseTraverse(DefaultProcessor.java:253)
at at ch.qos.logback.core.model.processor.DefaultProcessor.secondPhaseTraverse(DefaultProcessor.java:253)
at at ch.qos.logback.core.model.processor.DefaultProcessor.traversalLoop(DefaultProcessor.java:90)
at at ch.qos.logback.core.model.processor.DefaultProcessor.process(DefaultProcessor.java:106)
at at ch.qos.logback.core.joran.GenericXMLConfigurator.processModel(GenericXMLConfigurator.java:216)
at at ch.qos.logback.core.joran.GenericXMLConfigurator.doConfigure(GenericXMLConfigurator.java:178)
at at ch.qos.logback.core.joran.GenericXMLConfigurator.doConfigure(GenericXMLConfigurator.java:123)
at at ch.qos.logback.core.joran.GenericXMLConfigurator.doConfigure(GenericXMLConfigurator.java:66)
at at ch.qos.logback.classic.util.DefaultJoranConfigurator.configureByResource(DefaultJoranConfigurator.java:68)
at at play.api.libs.logback.LogbackLoggerConfigurator.configure(LogbackLoggerConfigurator.scala:128)
at at play.api.libs.logback.LogbackLoggerConfigurator.configure(LogbackLoggerConfigurator.scala:75)
at at di.CashboxOnlineComponents.$anonfun$new$1(CashboxOnlineComponents.scala:22)
at at di.CashboxOnlineComponents.$anonfun$new$1$adapted(CashboxOnlineComponents.scala:22)
at at scala.Option.foreach(Option.scala:437)
at at di.CashboxOnlineComponents.<init>(CashboxOnlineComponents.scala:22)
at at ApplicationLoader.load(ApplicationLoader.scala:15)
at at play.core.server.DevServerStart$DevServerApplicationProvider$1.$anonfun$reload$2(DevServerStart.scala:233)
at at play.utils.Threads$.withContextClassLoader(Threads.scala:22)
at at play.core.server.DevServerStart$DevServerApplicationProvider$1.reload(DevServerStart.scala:225)
at at play.core.server.DevServerStart$DevServerApplicationProvider$1.get(DevServerStart.scala:190)
at at play.core.server.AkkaHttpServer.handleRequest(AkkaHttpServer.scala:320)
at at play.core.server.AkkaHttpServer.$anonfun$createServerBinding$1(AkkaHttpServer.scala:224)
at at akka.stream.impl.fusing.MapAsyncUnordered$$anon$31.onPush(Ops.scala:1430)
at at akka.stream.impl.fusing.GraphInterpreter.processPush(GraphInterpreter.scala:542)
at at akka.stream.impl.fusing.GraphInterpreter.execute(GraphInterpreter.scala:423)
at at akka.stream.impl.fusing.GraphInterpreterShell.runBatch(ActorGraphInterpreter.scala:650)
at at akka.stream.impl.fusing.GraphInterpreterShell$AsyncInput.execute(ActorGraphInterpreter.scala:521)
at at akka.stream.impl.fusing.GraphInterpreterShell.processEvent(ActorGraphInterpreter.scala:625)
at at akka.stream.impl.fusing.ActorGraphInterpreter.akka$stream$impl$fusing$ActorGraphInterpreter$$processEvent(ActorGraphInterpreter.scala:800)
at at akka.stream.impl.fusing.ActorGraphInterpreter$$anonfun$receive$1.applyOrElse(ActorGraphInterpreter.scala:818)
at at akka.actor.Actor.aroundReceive(Actor.scala:537)
at at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at at akka.stream.impl.fusing.ActorGraphInterpreter.aroundReceive(ActorGraphInterpreter.scala:716)
at at akka.actor.ActorCell.receiveMessage(ActorCell.scala:579)
at at akka.actor.ActorCell.invoke(ActorCell.scala:547)
at at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at at java.base/java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:373)
at at java.base/java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1182)
at at java.base/java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1655)
at at java.base/java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1622)
at at java.base/java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:165)
What am I doing wrong?
What am I doing wrong?
Hard to say – the stacktrace is incomplete.
Oops, missed one line, edited my previous message ⏫
As the exception says – your class cannot be found on the classpath. Based on the given information I cannot see why this happens. You may want to create a very basic example that can be shared via a public GitHub repository...