Double compression when using v1.20 in combination with gorilla/mux CompressHandler
Mattie112 opened this issue · 10 comments
To start: I am not sure whether this issue belongs here, at https://github.com/gorilla/mux, or at https://github.com/gorilla/handlers. I figured I would start here, as other people are possibly approaching it from this angle too.
The problem:
With v1.19 I could access the /metrics endpoint in my browser and it would work. Now with v1.20(.1) I get to see gzipped content.
I have traced it back to the use of the CompressHandler from gorilla/handlers. So why am I reporting it here then? Because it started occurring after a new release here, so perhaps the bug is somewhere else, but someone here may have a clue.
Minimal example:
main.go:

```go
package main

import (
	"net/http"

	"github.com/gorilla/handlers"
	"github.com/gorilla/mux"
	"github.com/prometheus/client_golang/prometheus/promhttp"
)

func main() {
	router := mux.NewRouter()
	router.Handle("/metrics", promhttp.Handler())

	http.ListenAndServe(":8080", handlers.CompressHandler(router)) // Breaks
	// http.ListenAndServe(":8080", router) // Works
}
```
In your `go.mod`:

```
require github.com/prometheus/client_golang v1.20.1
require github.com/gorilla/handlers v1.5.2
require github.com/gorilla/mux v1.8.1
```

(Tested with Go 1.21 and 1.23.)
If you open /metrics in your browser you will get gzipped output. If you then roll back to v1.19 it works as intended.
I did find this (old) issue on the handlers repo: gorilla/handlers#153, so it has been fixed in the past. Does anyone have a clue whether the changes made here could affect this somehow?
Hey @Mattie112! I'm here to help you with any bugs, questions, or contributions you have. Let's tackle this issue together!
The issue you're encountering with double compression when using `CompressHandler` from the `gorilla/handlers` package after upgrading to version 1.20.1 of the Prometheus client library is due to the Prometheus client library now supporting compression by default. This means that the `/metrics` endpoint will automatically compress the response if the client supports it, as indicated by the `Accept-Encoding` header.
In version 1.20.1, the Prometheus client library uses the `HandlerOpts{DisableCompression: false}` setting, which enables compression by default. When you wrap the router with `CompressHandler`, it detects the already compressed response and applies its own compression again, leading to double compression.
Here's a relevant code snippet from the Prometheus client library:

```go
handler := InstrumentMetricHandler(reg, HandlerForTransactional(mReg, HandlerOpts{DisableCompression: false}))
```
To avoid this double compression, you can either disable the compression in the Prometheus client library or avoid using `CompressHandler` for the `/metrics` endpoint. Here are two possible solutions:
- **Disable compression in the Prometheus client library:** You can disable the default compression by setting `DisableCompression` to `true` in the `HandlerOpts`:

  ```go
  router.Handle("/metrics", promhttp.HandlerFor(prometheus.DefaultGatherer, promhttp.HandlerOpts{DisableCompression: true}))
  ```
- **Avoid using `CompressHandler` for the `/metrics` endpoint:** You can selectively apply `CompressHandler` to other routes but not to the `/metrics` endpoint:

  ```go
  router := mux.NewRouter()
  router.Handle("/metrics", promhttp.Handler())
  // Apply CompressHandler to other routes if needed
  http.ListenAndServe(":8080", router)
  ```
By implementing one of these solutions, you should be able to avoid the double compression issue and have the `/metrics` endpoint return plain text as expected [1][2].
Just to confirm:

```go
router.Handle("/metrics", promhttp.HandlerFor(prometheus.DefaultGatherer, promhttp.HandlerOpts{DisableCompression: true}))
```

has no effect for me; it is still double compressed.
And the 2nd option, as far as I know, is not possible. I can only add the CompressHandler to the entire server.
(cool AI service btw!)
> (cool AI service btw!)
We're giving it a try, looks useful as a quick first responder :)
It seems like desubot was going in the right direction. `DisableCompression` should disable the gzip compression, but I'm a bit disappointed that it suggested exactly the same thing after you said it didn't work.
I tried to reproduce the problem and confirmed it! I think we have a bug related to #1496, although it is not clear to me what the bug is at first glance. When compression is disabled, the client defaults to no compression [1][2].
> And the 2nd option, as far as I know, is not possible. I can only add the CompressHandler to the entire server.
I was able to spin up two webservers, one compressed and another without compression, with the following snippet:
```go
package main

import (
	"log"
	"net/http"

	"github.com/prometheus/client_golang/prometheus"
	"github.com/prometheus/client_golang/prometheus/collectors"
	"github.com/prometheus/client_golang/prometheus/promhttp"

	"github.com/gorilla/handlers"
	"github.com/gorilla/mux"
)

func main() {
	// Create non-global registry.
	reg := prometheus.NewRegistry()

	// Add go runtime metrics and process collectors.
	reg.MustRegister(
		collectors.NewGoCollector(),
		collectors.NewProcessCollector(collectors.ProcessCollectorOpts{}),
	)

	// Expose handlers in separate goroutines.
	go func() {
		http.Handle("/metrics", promhttp.HandlerFor(reg, promhttp.HandlerOpts{Registry: reg, DisableCompression: true}))
		log.Fatal(http.ListenAndServe(":8080", nil))
	}()
	go func() {
		router := mux.NewRouter()
		router.Handle("/metrics", promhttp.HandlerFor(reg, promhttp.HandlerOpts{Registry: reg}))
		log.Fatal(http.ListenAndServe(":8081", handlers.CompressHandler(router)))
	}()

	select {}
}
```
Wow, desubot just erased his second comment after I said it was not a good comment 🤯
Thanks for the report and investigation!
> With v1.19 I could access the /metrics endpoint in my browser and it would work. Now with v1.20(.1) I get to see gzipped content.
I investigated a bit and it looks like the response is NOT double compressed; that part works fine. Specifically, gorilla/handlers#157 helps.

The only problem is that with the new promhttp version we *correctly* set the response HTTP Content-Encoding to `identity` when we don't compress. This interacts badly with the gorilla compression layer, which compresses to gzip but does NOT update the response content encoding, resulting in:
```
content-encoding: identity
content-type: text/plain; version=0.0.4; charset=utf-8; escaping=values
date: Fri, 23 Aug 2024 09:42:08 GMT
transfer-encoding: chunked
vary: Accept-Encoding
```
...and then you see weird behaviour from browsers or other clients.
Removing `rsp.Header().Set(contentEncodingHeader, encodingHeader)` makes everything work correctly with gorilla compression:
```
content-encoding: gzip
content-type: text/plain; version=0.0.4; charset=utf-8; escaping=values
date: Fri, 23 Aug 2024 09:49:15 GMT
transfer-encoding: chunked
vary: Accept-Encoding
```
Fix
This is actually our bug, not the gorilla lib's, because they cannot set headers AFTER the response has started being written (https://github.com/gorilla/handlers/blob/main/compress.go#L118). As a fix we need to:

- NOT set "Content-Encoding" when we don't compress, as before (regression), allowing others to cleanly set the content encoding for us.
- Additionally, we could also error when "Content-Encoding" is already set in the writer response AND we are compressing our response (failing fast for bugs causing double compression).
@ArthurSens would you like to do the fix? 🤔
> It seems like desubot was going in the right direction. [...]
Yeah, I mean the first post was pretty OK. It was one thing that I had not tested yet (although I was quite certain that it was a bug somewhere). But I think it is a nice addition. I did see another response indeed (even though I did not tag the bot), but now I don't see it. Perhaps it took your negative response as an indicator to delete the post? That is a bit strange, perhaps?
> I was able to spin up two webservers, one compressed and another without compression [...]
That indeed is an option, but it would require me to make a code change to all my services (and we have quite a few) :)
> This is actually our bug, not the gorilla lib's, because they cannot set headers AFTER the response has started being written. [...]
Thank you for your thorough investigation and explanation. Looking forward to a fix. I am really happy that I don't need to make code changes to all our microservices :)
Thanks for the investigation @bwplotka and apologies for introducing an unintended bug here.
Reading through RFC 2616, it seems like we should not set `Content-Encoding: identity` at all:

> identity
>
> The default (identity) encoding; the use of no transformation whatsoever. This content-coding is used only in the Accept-Encoding header, and SHOULD NOT be used in the Content-Encoding header.
Hi there,
Just to confirm: this new release fixed the issue for us. Many thanks for the quick response and fix!