BRUHItsABunny/go-android-firebase

Issue when many incoming messages arrive

developerfromjokela opened this issue · 7 comments

panic: proto.Unmarshal: proto: cannot parse invalid wire-format data

goroutine 39 [running]:
github.com/BRUHItsABunny/go-android-firebase/client.(*MTalkCon).loop(0xc000214240)
/go/pkg/mod/github.com/!b!r!u!h!its!a!bunny/go-android-firebase@v0.1.0/client/mtalk.go:92 +0x65
created by github.com/BRUHItsABunny/go-android-firebase/client.(*MTalkCon).Connect in goroutine 19
/go/pkg/mod/github.com/!b!r!u!h!its!a!bunny/go-android-firebase@v0.1.0/client/mtalk.go:84 +0x30a

@developerfromjokela - experiencing the same issue. Did you ever find a resolution?

What rate of incoming messages does it take to recreate this?

I haven't been able to recreate this; any help with that would let me look into it.

I think the issue is due to TCP packet fragmentation, which becomes more likely as the packet rate increases. The client would need to buffer and reassemble partial message fragments to avoid this.
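
As a side note, the error in the trace is easy to reproduce in isolation by unmarshaling a truncated buffer. This is only a hypothetical sketch (it uses wrapperspb as a stand-in message type, not the library's actual MCS protobuf types), but it shows what happens when a read returns only part of a message:

```go
package main

import (
	"fmt"

	"google.golang.org/protobuf/proto"
	"google.golang.org/protobuf/types/known/wrapperspb"
)

func main() {
	// Serialize a stand-in protobuf message. Any message type shows the
	// same failure mode; this one is just convenient to construct.
	full, err := proto.Marshal(wrapperspb.String("a long enough notification payload"))
	if err != nil {
		panic(err)
	}

	// Pretend the server's write was split across TCP segments and only the
	// first half has arrived when we try to decode it.
	half := full[:len(full)/2]

	var out wrapperspb.StringValue
	if err := proto.Unmarshal(half, &out); err != nil {
		// Fails with an invalid wire-format error, like the panic above.
		fmt.Println(err)
	}
}
```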

I'm not saying it isn't TCP fragmentation, it very well could be. But isn't TCP fragmentation usually handled by the OS TCP stack?

My knowledge of raw TCP isn't very extensive; I will do some more research.

I used a bad term. You are right that it is likely not IP fragmentation. Let me be more precise.

In general you cannot assume that when you read data from a socket, you will get exactly one protobuf message. gRPC servers buffer output data: below a low water mark (when total service utilisation is low), the server will combine small consecutive messages into a single TCP packet to improve efficiency. Above a high water mark (when total service utilisation is high), the server will split messages across packets to get them onto the wire faster and reduce pressure on the buffer. The latter situation is what is happening occasionally and causing the panic. In any case, you need to handle receiving less than one message, exactly one message, or more than one message per read.
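
To illustrate the idea, here is a minimal sketch of a framing reader that reassembles messages regardless of how the bytes are chunked on the wire. It assumes a plain uvarint length prefix; the actual MTalk wire format differs, but the buffering approach is the same:

```go
package main

import (
	"bufio"
	"bytes"
	"encoding/binary"
	"fmt"
	"io"
)

// readFrame reads one length-prefixed payload from r. bufio.Reader plus
// io.ReadFull keep reading until the whole payload has arrived, so a message
// split across several TCP segments is reassembled before it is handed to
// proto.Unmarshal. The uvarint length prefix is an assumption for this
// sketch, not the library's real framing.
func readFrame(r *bufio.Reader) ([]byte, error) {
	size, err := binary.ReadUvarint(r)
	if err != nil {
		return nil, err
	}
	payload := make([]byte, size)
	if _, err := io.ReadFull(r, payload); err != nil {
		return nil, err
	}
	return payload, nil
}

func main() {
	// Simulate the server coalescing two messages into one burst of bytes:
	// the reader still recovers them one complete frame at a time.
	var wire bytes.Buffer
	for _, msg := range []string{"notification-1", "notification-2"} {
		var lenBuf [binary.MaxVarintLen64]byte
		n := binary.PutUvarint(lenBuf[:], uint64(len(msg)))
		wire.Write(lenBuf[:n])
		wire.WriteString(msg)
	}

	r := bufio.NewReader(&wire)
	for {
		payload, err := readFrame(r)
		if err == io.EOF {
			break
		}
		if err != nil {
			panic(err)
		}
		fmt.Printf("got %d-byte frame: %s\n", len(payload), payload)
	}
}
```

The same loop handles the opposite case as well: if a frame has only partially arrived, io.ReadFull blocks until the rest shows up instead of handing a truncated buffer to the protobuf decoder.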

@developerfromjokela - experiencing the same issue. Did you ever find a resolution?

Unfortunately not; I stopped using it because of this issue.

But @BRUHItsABunny, it would be nice to get this fixed. My use case involved receiving many push notifications at the same time in some situations, so the connection was quite busy.