Custom parameter on fields?
Closed this issue · 1 comment
Hi,
I am working on a new project that will use Kafka with Avro schemas. The lib you have developed here looks great! This is not an issue report, just a query/idea.
Now, in addition to using schemas, I am considering encrypting some of the data (i.e. some fields) before it goes into Kafka. That is of course easy to do in both the producer and consumer of certain topics, but my instinct says: if we could declare in the schema's fields array which fields should be encrypted/decrypted, this could be "automatic".
I'm new to a lot of the details here, but I skimmed http://avro.apache.org/docs/current/spec.html and did not see any obvious way to add additional/optional parameters to a field.
Ideally, since we're contemplating per-record-ID encryption, one would be able to write something like:
{
  "namespace": "example.avro",
  "type": "record",
  "name": "User",
  "fields": [
    {"name": "id", "type": "int"},
    {"name": "name", "type": "string"},
    {"name": "secretmessage", "type": "string", "encryptionKeyFieldName": "id"},
    {"name": "favorite_number", "type": ["int", "null"]},
    {"name": "favorite_color", "type": ["string", "null"]}
  ]
}
I.e. the data in "secretmessage" is encrypted, and since the value of "encryptionKeyFieldName" is "id", the system effectively holds a mapping somewhere from user.id to a symmetric encryption key. The name "encryptionKeyFieldName" is not great, but it illustrates the idea.
Has anyone thought about this before? Any ways of adding such details to the schema without breaking the avro standard? All input is appreciated.
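For what it's worth, the Avro spec does permit attributes beyond the reserved ones (implementations generally carry them along as metadata), so the schema above parses as plain JSON too. Here is a minimal, hypothetical sketch of how a producer-side wrapper could read such an attribute and encrypt tagged fields. The attribute name `encryptionKeyFieldName`, the `lookup_key` key store, and the XOR stand-in cipher are all illustrative assumptions, not a real crypto scheme or an API of any Avro library:

```python
# Sketch only: drive per-field encryption from a custom schema attribute.
# "encryptionKeyFieldName", lookup_key(), and the XOR cipher are hypothetical
# placeholders for a real key store and a real symmetric cipher.
import json

SCHEMA = json.loads("""
{
  "namespace": "example.avro",
  "type": "record",
  "name": "User",
  "fields": [
    {"name": "id", "type": "int"},
    {"name": "name", "type": "string"},
    {"name": "secretmessage", "type": "string", "encryptionKeyFieldName": "id"},
    {"name": "favorite_number", "type": ["int", "null"]},
    {"name": "favorite_color", "type": ["string", "null"]}
  ]
}
""")

def encrypted_fields(schema):
    """Map each tagged field name to the field that holds its key id."""
    return {f["name"]: f["encryptionKeyFieldName"]
            for f in schema["fields"] if "encryptionKeyFieldName" in f}

def lookup_key(key_id):
    # Placeholder for a real key store lookup (e.g. user.id -> key bytes).
    return str(key_id).encode() * 4

def xor_bytes(data, key):
    # Stand-in for a real symmetric cipher; XOR is self-inverse, so the
    # same function decrypts.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def encrypt_record(record, schema):
    """Return a copy of the record with tagged fields encrypted (hex-encoded)."""
    out = dict(record)
    for field, key_field in encrypted_fields(schema).items():
        key = lookup_key(record[key_field])
        out[field] = xor_bytes(record[field].encode(), key).hex()
    return out
```

A consumer would do the inverse: read `encryptionKeyFieldName` from the same schema, look up the key from the named field, and decrypt before handing the record to application code.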
Just wanted to close this. Discussed it on the Confluent Slack: while there is no restriction on extra schema attributes, this usage is not recommended, so I will go with another approach.