
A crate to convert bytes to Value and the other way around, in a way compatible with the Confluent Schema Registry.


# schema_registry_converter


This library provides a way of using the Confluent Schema Registry that is compatible with the usual JVM usage. Both consuming/decoding and producing/encoding are supported. It's also possible to provide the schema to use when decoding; when no schema is provided, the latest schema for the same subject will be used. As far as I know, it's feature complete compared to the Confluent Java version. As I'm still pretty new to Rust, PRs and remarks for improvements are greatly appreciated.

Consumer

For consuming messages encoded with the schema registry, the correct schema needs to be fetched from the schema registry in order to transform the raw bytes into a record. For clarity, error handling is omitted from the diagram.

(Diagram: consumer activity flow)
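
In code, a sketch of the consuming side could look like the function below. It reuses the Decoder and the rdkafka message type from the example further down; the "name" field is only an assumption for illustration.

use avro_rs::types::Value;
use rdkafka::message::{BorrowedMessage, Message};
use schema_registry_converter::Decoder;

// Decode the raw payload via the schema registry and pull a hypothetical
// "name" field out of the resulting Avro record.
fn name_from_message(msg: &BorrowedMessage, decoder: &mut Decoder) -> Option<String> {
    match decoder.decode(msg.payload()) {
        Ok(Value::Record(fields)) => fields
            .into_iter()
            .find(|(name, _)| name == "name")
            .and_then(|(_, value)| match value {
                Value::String(s) => Some(s),
                _ => None,
            }),
        _ => None,
    }
}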

Producer

For producing messages which can be properly consumed by other clients, the correct schema id needs to be encoded with the message. To get that id, it might be necessary to register a new schema first. For clarity, error handling is omitted from the diagram.

(Diagram: producer activity flow)
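
As a rough sketch (reusing the Encoder and SubjectNameStrategy from the example further down; the topic and field names are made up), encoding boils down to:

use avro_rs::types::Value;
use schema_registry_converter::Encoder;
use schema_registry_converter::schema_registry::SubjectNameStrategy;

// Encode a value for the hypothetical topic "greetings". The returned bytes start
// with the Confluent wire format header: one magic byte (0) followed by the
// 4-byte schema id, then the Avro-encoded body.
fn encode_greeting(encoder: &mut Encoder) -> Vec<u8> {
    let strategy = SubjectNameStrategy::TopicNameStrategy("greetings", false);
    match encoder.encode(
        vec![("greeting", Value::String("hello".to_owned()))],
        &strategy,
    ) {
        Ok(bytes) => bytes,
        Err(e) => panic!("Error encoding: {}", e),
    }
}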

Getting Started

schema_registry_converter is available on crates.io. It is recommended to look there for the newest and most elaborate documentation.

[dependencies]
schema_registry_converter = "0.3.2"

...and see the docs for how to use it.

Example

extern crate rdkafka;
extern crate avro_rs;
extern crate schema_registry_converter;

use rdkafka::message::{Message, BorrowedMessage};
use rdkafka::producer::FutureRecord;
use avro_rs::types::Value;
use schema_registry_converter::Decoder;
use schema_registry_converter::Encoder;
use schema_registry_converter::schema_registry::SubjectNameStrategy;


fn get_value<'a>(
    msg: &'a BorrowedMessage,
    decoder: &'a mut Decoder,
) -> Value {
    match decoder.decode(msg.payload()) {
        Ok(v) => v,
        Err(e) => panic!("Error getting value: {}", e),
    }
}

fn get_payload(
    topic: &str,
    values: Vec<(&'static str, Value)>,
    encoder: &mut Encoder,
) -> Vec<u8> {
    let subject_name_strategy = SubjectNameStrategy::TopicNameStrategy(topic, false);
    match encoder.encode(values, &subject_name_strategy) {
        Ok(v) => v,
        Err(e) => panic!("Error getting payload: {}", e),
    }
}

// The encoded payload is passed in by reference so that it outlives the record
// it is borrowed into.
fn get_future_record<'a>(
    topic: &'a str,
    key: Option<&'a str>,
    payload: &'a Vec<u8>,
) -> FutureRecord<'a, str, Vec<u8>> {
    FutureRecord {
        topic,
        partition: None,
        payload: Some(payload),
        key,
        timestamp: None,
        headers: None,
    }
}

fn main() {
    // SERVER_ADDRESS is a placeholder for the url of the schema registry.
    let mut decoder = Decoder::new(SERVER_ADDRESS);
    let mut encoder = Encoder::new(SERVER_ADDRESS);
    //somewhere else the above functions can be called
}

Relation to related libraries

The Avro part of the conversion is handled by avro-rs; as such, I don't include tests for every possible schema. While I have used rdkafka in combination with this crate to successfully consume from and produce to Kafka, and while it's used in the example, this crate has no direct dependency on it. All this crate does is convert [u8] <-> avro_rs::types::Value.
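
To make that concrete, here is a minimal round-trip sketch (again reusing the API from the example above, with a made-up topic and field); no Kafka client is involved at any point.

use avro_rs::types::Value;
use schema_registry_converter::{Decoder, Encoder};
use schema_registry_converter::schema_registry::SubjectNameStrategy;

// Vec<(&str, Value)> -> Vec<u8> via the encoder, then Vec<u8> -> Value via the decoder.
fn round_trip(encoder: &mut Encoder, decoder: &mut Decoder) -> Value {
    let strategy = SubjectNameStrategy::TopicNameStrategy("mytopic", false);
    let bytes = match encoder.encode(vec![("id", Value::Long(42))], &strategy) {
        Ok(b) => b,
        Err(e) => panic!("Error encoding: {}", e),
    };
    match decoder.decode(Some(bytes.as_slice())) {
        Ok(v) => v,
        Err(e) => panic!("Error decoding: {}", e),
    }
}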

Tests

Because mockito, which is used for mocking the schema registry responses, runs in a separate thread, the tests have to be run with --test-threads=1, for example: cargo +stable test --color=always -- --nocapture --test-threads=1

License

This project is licensed under either of

- Apache License, Version 2.0
- MIT license

at your option.

Contribution

Unless you explicitly state otherwise, any contribution intentionally submitted for inclusion in Schema Registry Converter by you, as defined in the Apache-2.0 license, shall be dual licensed as above, without any additional terms or conditions.