Publish for macOS and ARM too
slandelle opened this issue · 20 comments
It would be great to have binaries available on all platforms.
Maybe ask the Netty guys how they do that? ping @normanmaurer
FYI, I have the Netty decoder working nicely based on jvmbrotli (it's just a shame its bundled Brotli copy is very old), without having to fork anything, contrary to your original PR :)
GitHub Actions supports macOS and ARM. I think that's the easiest way to build cross-platform binaries.
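For illustration, a cross-platform build could be wired up with a runner matrix along these lines (a minimal sketch; the workflow name, runner labels, and build command are placeholders, not this project's actual configuration):

```yaml
# Hypothetical GitHub Actions workflow fragment: one job fanned
# out across Linux and macOS runners via a matrix.
name: build-natives
on: [push]
jobs:
  build:
    strategy:
      matrix:
        os: [ubuntu-latest, macos-latest]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v4
      - run: ./mvnw -B package   # placeholder build step
```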
Can you share the code of Decoder? I'm curious to know more about it.
It's still a WIP in a private branch I can't open source yet.
I can push a public WIP branch as soon as you have a release available for MacOS and w/ Apache 2 license.
I'll then add tests and contribute upstream to Netty.
Note: I'll only contribute the Decoder (needed for Gatling), not the Encoder.
Perfect makes sense.
It's done: v1.2.0 uses the Apache License 2.0 and supports macOS. Can we work on Netty Brotli support now?
I forgot to change the license type in 1.2.0. Fixed in 1.2.1, sorry for the inconvenience.
No prob, thanks.
Regarding the decoder, the big difference from your original version is that I eagerly pull the decompressed data chunks instead of waiting for the whole payload and getting one single, possibly huge, byte array.
My logic behind the decoder was: one chunk of Brotli can be up to 16 MB. Whenever I get a `ByteBuf`, I pass it to `Decoder`, and `Decoder` tells me whether decoding was `SUCCESSFUL` or `NEEDS_MORE_INPUT`.
For `NEEDS_MORE_INPUT`, we keep aggregating `ByteBuf`s and passing them to `Decoder` until decoding is `SUCCESSFUL` or the aggregation exceeds 16 MB.
If a 2 GB file is compressed with Brotli, each chunk can still be at most 16 MB, so we're not aggregating the whole payload, only individual chunks.
Does this sound good?
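The control flow described above can be sketched in plain Java (a hypothetical stand-in: `decode` is a fake decoder that only checks an expected length, not the real brotli4j API, and Netty's `ByteBuf` is replaced with byte arrays):

```java
import java.io.ByteArrayOutputStream;

// Sketch of the aggregation strategy: buffer incoming fragments
// until the decoder reports SUCCESSFUL, or give up past 16 MB.
public class AggregatingDecoderSketch {
    enum Status { SUCCESSFUL, NEEDS_MORE_INPUT }

    static final int MAX_CHUNK = 16 * 1024 * 1024; // 16 MB Brotli chunk cap

    // Fake decoder: "succeeds" once the aggregation reaches the
    // expected length. A real implementation would call Brotli.
    static Status decode(byte[] aggregated, int expectedLength) {
        return aggregated.length >= expectedLength
                ? Status.SUCCESSFUL
                : Status.NEEDS_MORE_INPUT;
    }

    public static byte[] aggregate(byte[][] fragments, int expectedLength) {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        for (byte[] fragment : fragments) {
            buf.write(fragment, 0, fragment.length); // keep aggregating
            if (buf.size() > MAX_CHUNK) {
                throw new IllegalStateException("chunk exceeds 16 MB");
            }
            if (decode(buf.toByteArray(), expectedLength) == Status.SUCCESSFUL) {
                return buf.toByteArray();
            }
        }
        throw new IllegalStateException("ran out of input");
    }

    public static void main(String[] args) {
        byte[][] fragments = { new byte[3], new byte[4], new byte[5] };
        System.out.println(aggregate(fragments, 10).length);
    }
}
```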
I'm really not a Brotli expert, but my use case is mostly web, where I would never see such large payloads. So I would be more inclined to use the same model as the existing ZlibDecoders, which pull decompressed data as soon as it's available. Moreover, IIRC, your own take did a lot of memory copies, such as https://github.com/netty/netty/pull/10626/files#diff-04ab86c9d58745be5ec7455a7a9e7a011426adb5e556a8c206d132e212e7b037R35-R38.
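The ZlibDecoder-style model can be sketched the same way in plain Java (again hypothetical: `decodeFragment` fakes decompression; a real handler would call the native Brotli decoder and fire each chunk down the Netty pipeline):

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the "pull as soon as available" model: every decoded
// chunk is emitted downstream immediately, so memory use stays
// bounded by the chunk size instead of the whole payload.
public class EagerDecodeSketch {
    // Fake decompression: pretend each fragment expands 2x.
    static byte[] decodeFragment(byte[] compressed) {
        return new byte[compressed.length * 2];
    }

    public static List<byte[]> decodeAll(byte[][] fragments) {
        List<byte[]> emitted = new ArrayList<>();
        for (byte[] fragment : fragments) {
            emitted.add(decodeFragment(fragment)); // emit right away, no aggregation
        }
        return emitted;
    }

    public static void main(String[] args) {
        List<byte[]> out = decodeAll(new byte[][]{ new byte[2], new byte[3] });
        System.out.println(out.size()); // chunks emitted independently
    }
}
```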
If you can open a PR in Netty, we can work with others and find the best possible way to handle these issues. I am also running out of ideas. :p
Will do.
However, sad news: the macOS jar is actually empty: https://repo1.maven.org/maven2/com/aayushatharva/brotli4j/native-macos_x86-64/1.2.1/
Life is unfair. :( Lemme debug this now.
Just asking: are you sure the file you're generating has a `dylib` extension? The Netty files have a `jnilib` one (which is not what `System#mapLibraryName` returns, so they have to hard-code it).
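For reference, `System#mapLibraryName` produces the platform's conventional filename, which is why a `jnilib` suffix has to be hard-coded separately:

```java
public class MapLibraryNameDemo {
    public static void main(String[] args) {
        // Maps a bare library name to the platform convention:
        // libbrotli.so on Linux, brotli.dll on Windows, and
        // libbrotli.dylib on macOS with modern JDKs (older JDKs
        // on macOS returned .jnilib, hence the hard-coded suffix).
        System.out.println(System.mapLibraryName("brotli"));
    }
}
```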
Yes, the binary is generated with a `dylib` extension.
@slandelle Deployed 1.2.2 on Central after checking each file carefully. It's been a BrotliDay
today. :D
Awesome, thanks a lot! I confirm it works now.
This ticket mentions ARM, but I can only see x86_64 binaries at https://search.maven.org/search?q=g:com.aayushatharva.brotli4j
Is there any chance of a Linux aarch64 binary too? Should I open a new ticket?
Thanks!
@martin-g I have plans for ARM in the future, but I'm swamped with work these days.
However, if you can open a PR for it, I can merge it and publish ARM binaries.
Awesome! I'll send you a PR soon!