Question for Afterburner OptimizedBeanPropertyWriter
BOFA1ex opened this issue · 8 comments
[QUESTION]
Why does `OptimizedBeanPropertyWriter` use the `broken` flag and the `fallbackWriter` to invoke `serializeAsField`, instead of the optimized path below?

```java
value = this._propertyAccessor.xxxGetter(bean, this._propertyIndex);
JsonSerializer<?> ser = this._serializer;
ser.serialize(value, gen, prov);
```
The Afterburner module registers its `BeanSerializerModifier` during module setup:

```java
public class AfterburnerModule extends Module implements Serializable {
    public void setupModule(Module.SetupContext context) {
        context.addBeanSerializerModifier(new SerializerModifier(cl));
    }
}
```
In my setup there is another `BeanSerializerModifier` that declares `changeProperties`, like below:

```java
@Override
public List<BeanPropertyWriter> changeProperties(SerializationConfig config, BeanDescription beanDesc, List<BeanPropertyWriter> beanProperties) {
    for (BeanPropertyWriter property : beanProperties) {
        if (registerJavaTypes.contains(property.getType().getRawClass())) {
            continue;
        }
        property.assignSerializer(shapeSerializerMap.get(ofNullable(property
                .findPropertyFormat(config, beanDesc.getBeanClass()))
                .map(JsonFormat.Value::getShape)
                .orElse(ANY))
        );
    }
    return beanProperties;
}
```
Because of that, `broken` gets set to `true` on `OptimizedBeanPropertyWriter`, like below:

```java
public void assignSerializer(JsonSerializer<Object> ser) {
    super.assignSerializer(ser);
    if (this.fallbackWriter != null) {
        this.fallbackWriter.assignSerializer(ser);
    }
    if (!SerializerUtil.isDefaultSerializer(ser)) {
        this.broken = true;
    }
}
```
Serialization then jumps to `this.fallbackWriter.serializeAsField(bean, gen, prov)`, which goes through reflective invocation rather than the generated accessor:

```java
if (this.broken) {
    this.fallbackWriter.serializeAsField(bean, gen, prov);
} else {
    Object value;
    try {
        value = this._propertyAccessor.objectGetter(bean, this._propertyIndex);
    } catch (Throwable var8) {
        this._handleProblem(bean, gen, prov, var8, false);
        return;
    }
    if (value == null) {
        if (this._nullSerializer != null) {
            gen.writeFieldName(this._fastName);
            this._nullSerializer.serialize((Object) null, gen, prov);
        } else if (!this._suppressNulls) {
            gen.writeFieldName(this._fastName);
            prov.defaultSerializeNull(gen);
        }
    } else {
        JsonSerializer<Object> ser = this._serializer;
        if (ser == null) {
            Class<?> cls = value.getClass();
            PropertySerializerMap map = this._dynamicSerializers;
            ser = map.serializerFor(cls);
            if (ser == null) {
                ser = this._findAndAddDynamic(map, cls, prov);
            }
        }
        // (suppression checks elided)
        gen.writeFieldName(this._fastName);
        ser.serialize(value, gen, prov);
    }
}
```
Declaring the `@JacksonStdImpl` annotation works for skipping the `broken = true` assignment:

```java
public static boolean isDefaultSerializer(JsonSerializer<?> ser) {
    if (ser == null) {
        return true;
    } else if (ClassUtil.isJacksonStdImpl(ser)) {
        return !(ser instanceof ToStringSerializer);
    } else {
        return false;
    }
}
```
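The mechanism behind that check can be sketched with plain annotations: a serializer class carrying the marker annotation is treated as "default", so assigning it does not flip `broken`. This is a stdlib-only sketch; `StdImpl`, `DefaultishSerializer`, `CustomSerializer`, and `SerializerUtilSketch` are hypothetical stand-ins for Jackson's `@JacksonStdImpl`, a standard serializer, a user serializer, and Afterburner's `SerializerUtil`:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;

// Stand-in for Jackson's @JacksonStdImpl marker annotation.
@Retention(RetentionPolicy.RUNTIME)
@interface StdImpl {}

// A serializer marked as a "standard" implementation.
@StdImpl
class DefaultishSerializer {}

// A user-provided serializer with no marker.
class CustomSerializer {}

final class SerializerUtilSketch {
    // Mirrors the shape of SerializerUtil.isDefaultSerializer:
    // null or marker-annotated serializers count as "default".
    static boolean isDefaultSerializer(Object ser) {
        return ser == null
                || ser.getClass().isAnnotationPresent(StdImpl.class);
    }
}
```

With this model, assigning a `CustomSerializer` would set `broken = true`, while a `DefaultishSerializer` leaves the optimized path active.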
But there is another problem: in `StringMethodPropertyWriter`, for example, the assigned serializer is skipped and the raw `value` is written directly:

```java
public final void serializeAsField(Object bean, JsonGenerator gen, SerializerProvider prov) throws Exception {
    if (this.broken) {
        this.fallbackWriter.serializeAsField(bean, gen, prov);
    } else {
        String value;
        try {
            value = this._propertyAccessor.stringGetter(bean, this._propertyIndex);
        } catch (Throwable var6) {
            this._handleProblem(bean, gen, prov, var6, false);
            return;
        }
        if (value == null) {
            if (this._nullSerializer != null) {
                gen.writeFieldName(this._fastName);
                this._nullSerializer.serialize((Object) null, gen, prov);
            } else if (!this._suppressNulls) {
                gen.writeFieldName(this._fastName);
                prov.defaultSerializeNull(gen);
            }
        } else {
            if (this._suppressableValue != null) {
                if (MARKER_FOR_EMPTY == this._suppressableValue) {
                    if (value.length() == 0) {
                        return;
                    }
                } else if (this._suppressableValue.equals(value)) {
                    return;
                }
            }
            gen.writeFieldName(this._fastName);
            // why not append `this._serializer.serialize(value, gen, prov);` here?
            gen.writeString(value);
        }
    }
}
```
This handling was added by @stevenschlansker few years back to work around issues for cases where failure by Afterburner to optimize class handling caused problems -- the idea is/was to just use regular serialization/deserialization when the (de)serializer Afterburner constructed seemed broken.
The "is standard Jackson serializer" check is there so that Afterburner avoids optimizing non-standard implementations: the standard ones are assumed safe to optimize. Your classes should not usually be considered standard, because there may be other side effects.
Beyond that, it'd make sense for you to explain the specific problem you are having.
Afterburner (and to some degree Blackbird) are a bit fragile by necessity, as they need to override low-level details; and sometimes changes to core Jackson databind can cause issues where overrides do not have all the same handling as the underlying default implementation.
One other thing: yes, anything using a `BeanSerializerModifier` may easily get into conflict with Afterburner's handling. That's part of the fragility.
I think we had the idea of maybe introducing an annotation for allowing marking of classes so Afterburner would skip them (similar to "don't change standard Jackson serializers"), to avoid such cases. Assuming this is what is happening here.
Thanks for the answer; using the `@JacksonStdImpl` annotation does indeed resolve the problem.
But `ObjectMethodPropertyWriter`/`ObjectFieldPropertyWriter` do use the assigned serializer after `propertyAccessor#objectGetter` returns the value. So why not use the same code to handle the primitive/String property writers?
The idea with primitive/String writers is that if (and only if) there is nothing unusual -- handling uses the default Jackson serializer, as per the annotation -- there is no need to add the overhead of calling a `JsonSerializer`, or of checking whether one exists; the default behavior is used directly instead.
You can think of it as inlining of serialization for that set of properties with no custom handling.
If there is some sort of customization, then the optimized `BeanPropertyWriter` should NOT get constructed or used; the default one (which will use whatever serializer is configured) should be called instead.
The intent for "broken" is to allow fallback dynamically; it is not a pretty solution but solved someone's problem when getting added.
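That dynamic fallback can be sketched in plain Java. All names below are illustrative stand-ins, not Afterburner's actual types: a writer tries its fast path until a custom serializer is assigned, after which every call is routed to the unoptimized fallback.

```java
// Single-method interface so a fallback can be supplied as a lambda.
interface PropertyWriterSketch {
    String write(Object bean);
}

final class OptimizedWriter implements PropertyWriterSketch {
    private final PropertyWriterSketch fallbackWriter;
    private boolean broken;

    OptimizedWriter(PropertyWriterSketch fallbackWriter) {
        this.fallbackWriter = fallbackWriter;
    }

    // Mirrors assignSerializer(): assigning a non-default serializer
    // flips the flag permanently for this writer.
    void assignCustomSerializer() {
        this.broken = true;
    }

    @Override
    public String write(Object bean) {
        if (broken) {
            return fallbackWriter.write(bean); // reflection-based default path
        }
        return "optimized:" + bean;            // generated-accessor fast path
    }
}
```

Usage: `new OptimizedWriter(b -> "fallback:" + b)` writes via the fast path until `assignCustomSerializer()` is called, after which all output goes through the fallback, which matches the intent described above.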
My problem here however is that I don't feel you have explained the actual problem you have -- just pointing to specific code that you suspect is related.
Would it be possible to have a test case that shows the actual problem?
In my use case, entities are rendered into a template: a property whose type is a `CharSequence`, or that is declared with `Property#Type.STRING`, should be serialized as an escaped string.
```java
@Target(ElementType.FIELD)
@Retention(RetentionPolicy.RUNTIME)
@Documented
public @interface Property {
    String value() default "";
    boolean required() default false;
    Type type() default Type.ANY;

    enum Type {
        /**
         * Marker enum value that indicates "whatever" choice, default impl for typing adapt
         * @implNote primitive/number/boolean #function#identify
         * @implNote charSequence #escape
         * @implSpec nebula-module support for data-type(date/time/datetime/duration)
         *
         * @implNote nebula#datetime datetime("2022-05-12T17:53:24")
         * @implNote nebula#date date("2022-05-12")
         * @implNote nebula#time time("17:53:24")
         * @implNote nebula#duration duration({years: 12, months: 5, days: 14, hours: 16, minutes: 12, seconds: 70})
         */
        ANY,
        /**
         * it => number(it)
         * @implNote primitive#identify
         * @implNote date => timestamp(date)
         */
        STRING,
        /**
         * it => "it"
         * @implNote primitive/charSequence/object escape(it)
         */
        NUMBER
    }
}
```
```java
public class xx extends AnnotationIntrospector {
    @Override
    public JsonFormat.Value findFormat(Annotated memberOrClass) {
        final Property property = _findAnnotation(memberOrClass, Property.class);
        final Pattern pattern = _findAnnotation(memberOrClass, Pattern.class);
        final JsonFormat.Shape shape = property == null ? ANY :
                JsonFormat.Shape.valueOf(property.type().name());
        return pattern == null ? forShape(shape) :
                new JsonFormat.Value(pattern.value(), shape, pattern.locale(), pattern.timezone(), empty(), pattern.lenient().asBoolean());
    }
}
```
```java
@Override
public List<BeanPropertyWriter> changeProperties(SerializationConfig config, BeanDescription beanDesc, List<BeanPropertyWriter> beanProperties) {
    for (BeanPropertyWriter property : beanProperties) {
        // the property's JsonFormat#shape decides which serializer to assign
        property.assignSerializer(shapeSerializerMap.get(ofNullable(property
                .findPropertyFormat(config, beanDesc.getBeanClass()))
                .map(JsonFormat.Value::getShape)
                .orElse(ANY))
        );
    }
    return beanProperties;
}
```
```java
static class PhanesAnyShapeSerializer extends StdSerializer<Object> {
    protected PhanesAnyShapeSerializer() {
        super(Object.class);
    }

    @Override
    public void serialize(Object o, JsonGenerator g, SerializerProvider provider) throws IOException {
        g.writeString(o.getClass().isPrimitive() || Number.class.isAssignableFrom(o.getClass()) ? o.toString() : "\"" + o + "\"");
    }
}

static class PhanesStringShapeSerializer extends StdSerializer<Object> {
    protected PhanesStringShapeSerializer() {
        super(Object.class);
    }

    @Override
    public void serialize(Object o, JsonGenerator g, SerializerProvider serializerProvider) throws IOException {
        g.writeString("\"" + o + "\"");
    }
}

static class PhanesNumberShapeSerializer extends StdSerializer<Object> {
    protected PhanesNumberShapeSerializer() {
        super(Object.class);
    }

    @Override
    public void serialize(Object value, JsonGenerator g, SerializerProvider provider) throws IOException {
        g.writeString(value.toString());
    }
}
```
Ok. So, this is getting closer... but I think it's missing `shapeSerializerMap` (to show that logic) as well as the other parts needed to make it runnable, so unfortunately I cannot reproduce the issue on my end.
But the problem here, I think, is that the code is assigning a serializer on a PropertyWriter that Afterburner has created after deciding there is nothing special about the property. Instead, Afterburner should be prevented from creating the optimized instance altogether.
I think what you would need to do is one of:
- ensure that your modifier runs before Afterburner's. This means registering your module after (or is it before? I never remember precedence) OR,
- replace the whole `BeanPropertyWriter` so as NOT to use the optimized variant -- it will not be optimized anyway, there is no benefit -- that is, construct a fresh new one
It would still sort of help if there was a self-contained test, with simplified form from above.
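The first option hinges on modifier ordering, which (as noted) is easy to get backwards. A stdlib-only sketch of the general mechanism: modifiers are applied in some registration-dependent order, and each one sees the output of the previous one, so whichever runs last wins. `ModifierChain` and its names are illustrative, not Jackson's API:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.UnaryOperator;

// Models a chain of BeanSerializerModifier-like hooks: each registered
// modifier may wrap or replace the writer produced so far.
final class ModifierChain {
    private final List<UnaryOperator<String>> modifiers = new ArrayList<>();

    void registerModule(UnaryOperator<String> modifier) {
        modifiers.add(modifier);
    }

    String buildWriter() {
        String writer = "default";
        for (UnaryOperator<String> m : modifiers) {
            writer = m.apply(writer); // later modifiers see earlier output
        }
        return writer;
    }
}
```

If Afterburner's modifier runs after yours, it ends up wrapping (and potentially re-optimizing) the writer your modifier produced; reversing the registration order flips who gets the last word.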