sequenceplanner/r2r

Serialize and deserialize messages with rcl serializer

KeplerC opened this issue · 6 comments

Thanks for the great library.

I am currently developing an application that receives ROS2 messages and sends the serialized messages over the network. I noticed r2r uses serde serialization, which may generate large and inefficient messages for types like sensor_msgs/Image. I am wondering if there are any interfaces to rcl's serializer, something like

static rclcpp::Serialization<serialization::msg::Mytypes> serializer;
serializer.serialize_message(&message, &serialized_msg);
m-dahl commented

Hi. Thanks for the kind words.

No, there is no interface to that functionality. If you need it, feel free to do some digging and make a PR. But my suggestion would be to use the bincode crate for serialization; it sounds like that should solve your immediate issue.

fn main() {
    use r2r::std_msgs::msg::String as StdString;
    let target: Option<StdString> = Some(StdString { data: "hello world".into() });

    // r2r message types derive serde's Serialize/Deserialize,
    // so any serde data format works, including bincode.
    let encoded: Vec<u8> = bincode::serialize(&target).unwrap();
    let decoded: Option<StdString> = bincode::deserialize(&encoded[..]).unwrap();
    assert_eq!(target, decoded);
}
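(The snippet above assumes the bincode 1.x API, i.e. a `bincode = "1"` dependency next to r2r; it compiles because r2r's generated message types already derive serde's Serialize/Deserialize.)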

Hi @m-dahl, thanks.

I actually tried bincode, and it seems to make the messages even larger:

"hello world" string:

bincode: 45 bytes
serde_json: 21 bytes

a small CompressedImage of size 21168:

bincode: 169506 bytes
serde_json: 42416 bytes

I will take a deeper dive into the serialization.

m-dahl commented

Hi again,

Hmm, I think something is wrong on your end. I just tried this:

fn load_png() -> Vec<u8> {
    use std::fs::File;
    use std::io::Read;

    let filename = "rustacean-orig-noshadow.png";
    let mut f = File::open(filename).expect("no file found");
    let metadata = std::fs::metadata(filename).expect("unable to read metadata");
    println!("input file size: {}", metadata.len());
    let mut buffer = vec![0; metadata.len() as usize];
    // read_exact fills the whole buffer; a plain read() may return early
    f.read_exact(&mut buffer).expect("failed to read file");
    buffer
}

fn main() {
    let data = load_png();
    use r2r::sensor_msgs::msg::CompressedImage;
    let mut img = CompressedImage::default();
    img.data = data;
    let encoded_bincode: Vec<u8> = bincode::serialize(&img).unwrap();
    let encoded_json: String = serde_json::to_string(&img).unwrap();

    println!("json len: {}", encoded_json.as_bytes().len());
    println!("bincode len: {}", encoded_bincode.len());

    // sanity check.
    let decoded: CompressedImage = bincode::deserialize(&encoded_bincode[..]).unwrap();
    assert_eq!(img, decoded);
}

I get

input file size: 58413
json len: 208504
bincode len: 58445

I.e., basically no overhead compared to the raw input when using bincode. (The JSON size is expected: every payload byte is written out as a decimal literal plus a separator, roughly 3.5 characters per byte.)
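(As an aside, `load_png` above could be replaced by a single `std::fs::read(filename)` call; the manual buffer version is only there to print the input size along the way.)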

Thanks for the prompt response. Bincode works on my end.

I see what I did wrong:
I used subscribe_untyped instead of subscribe::<r2r::sensor_msgs::msg::CompressedImage>; subscribe_untyped directly returns a serde_json::Value.

pub sender: mpsc::Sender<Result<serde_json::Value>>,
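For reference, a minimal sketch of why that inflates the size so much (my illustration, not code from the thread; it assumes the CompressedImage type seen above and bincode 1.x defaults): bincode over a serde_json::Value has to encode the JSON tree, so every payload byte becomes a JSON number, while bincode over the typed struct writes the u8 array directly.

fn main() {
    use r2r::sensor_msgs::msg::CompressedImage;

    // hypothetical payload matching the size quoted above
    let img = CompressedImage {
        data: vec![0u8; 21168],
        ..Default::default()
    };

    // typed path: the Vec<u8> payload is written out directly
    let typed = bincode::serialize(&img).unwrap();

    // untyped path: subscribe_untyped yields a serde_json::Value, and
    // bincode encodes each payload byte as a JSON number (an 8-byte
    // integer under bincode's fixed-int default), hence the ~8x blow-up
    let value = serde_json::to_value(&img).unwrap();
    let via_value = bincode::serialize(&value).unwrap();

    println!("typed bincode:      {}", typed.len());
    println!("bincode over Value: {}", via_value.len());
}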

I am wondering: would returning the WrappedNativeMsgUntyped directly be more straightforward than serde_json::Value?

m-dahl commented

Great that it worked. I think your suggestion could make sense for the subscriber. On the publisher end it would make things slightly more awkward, since you would need to create a message before you can fill in its fields (right now the publisher already knows the type).

On the other hand, I don't know what you would do with the messages without going through some easily introspectable format like JSON. I guess for your use case it can make sense to skip the serialization, if you need to dynamically set up these streams and pass the data on without knowing the types beforehand(?).

Let's think about it.

Thanks. I created a fork of the repo and replaced serde_json with bincode. I think you are right: my use case is special and does not require creating a new message on the publisher side.

The reason I used bincode instead of passing the raw bytes along unserialized is endianness when messages cross machines. It seems bincode takes care of that.
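(A minimal illustration of that property, assuming bincode 1.x defaults: the top-level serialize function uses fixed-width little-endian integers, so the byte layout does not depend on the host's native byte order.)

fn main() {
    // bincode 1.x's serialize() defaults to fixed-width little-endian
    // integers, so this holds on both little- and big-endian hosts
    let bytes = bincode::serialize(&1u32).unwrap();
    assert_eq!(bytes, vec![1, 0, 0, 0]);
}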

Since the problem is mostly resolved, I am closing this issue. Thanks for the help!