svenstaro/miniserve

[FR] Serve pre-compressed gzip / brotli sidecar files

Opened this issue · 4 comments

It would be nice if Miniserve could use sidecar gzip/brotli files that have been compressed ahead of time.

Caddy is able to do something similar using the precompressed option in the file_server stanza: https://caddyserver.com/docs/caddyfile/directives/file_server

No worries if it's out of scope for this project, just thought it would be nice. Thank you for your work on miniserve! :)

So basically you go miniserve myfile.zip and it serves the contents of that?

I think the idea is that if there is a myfile.zip.br, it can be served as myfile.zip, Content-Encoding: br without miniserve having to do the encoding itself.

I wasn't sure what sidecar meant, but the caddy website seems to suggest that for a myfile.zip file, caddy checks if there is a myfile.zip.gz or myfile.zip.br "sidecar" file, and it serves one of the sidecars if the client accepts that encoding.
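Something like this is what I have in mind — a purely illustrative sketch of the lookup (the `choose_sidecar` helper is made up, not anything miniserve or actix-files provide today), ignoring Accept-Encoding q-values for brevity:

```rust
use std::path::{Path, PathBuf};

/// If `requested` has a pre-compressed sidecar on disk and the client's
/// Accept-Encoding allows it, return that sidecar plus the Content-Encoding
/// value to send; otherwise fall back to the original file with no encoding.
/// Hypothetical helper, for illustration only.
fn choose_sidecar(requested: &Path, accept_encoding: &str) -> (PathBuf, Option<&'static str>) {
    // Preference order here is brotli first, then gzip (the order is a policy choice).
    for (ext, encoding) in [("br", "br"), ("gz", "gzip")] {
        let client_accepts = accept_encoding
            .split(',')
            .any(|token| token.trim().starts_with(encoding)); // q-values ignored
        if client_accepts {
            let mut sidecar = requested.as_os_str().to_owned();
            sidecar.push(".");
            sidecar.push(ext);
            let sidecar = PathBuf::from(sidecar);
            if sidecar.is_file() {
                // e.g. serve myfile.zip.br as myfile.zip, Content-Encoding: br
                return (sidecar, Some(encoding));
            }
        }
    }
    (requested.to_path_buf(), None)
}

fn main() {
    let (path, encoding) = choose_sidecar(Path::new("myfile.zip"), "gzip, deflate, br");
    println!("serve {} (Content-Encoding: {:?})", path.display(), encoding);
}
```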

That's exactly how it's done by Caddy and Nginx and others, yeah, and the main benefit of the compressed-sidecar-file approach is indeed that it gives the server a means to send compressed responses to an HTTP client without the additional server CPU time needed to compress each individual source file on the fly. Nginx even splits its built-in gzip support into a "static" module and a "filter" module so that webmasters don't have to use both if they don't want to, and Google does the same thing with their ngx_brotli module.

I definitely think this could be very useful for files that compress well into these two formats. The biggest downside is that doing it the sensible way means you still have to keep the uncompressed files on hand for clients like wget that don't have any built-in HTTP decompression, which in turn means higher storage requirements overall. Usually that isn't hard to justify, since the files you're working with are typically not that big in the first place, but it could get out of hand very quickly if you're using miniserve to offer up a bunch of .iso files, for example.

Another, not-so-sensible way to do it would be to keep only the .gz/.br files on the server, and if a client tries to fetch the uncompressed file without supporting the required Content-Encoding, miniserve does one of three things (sketched in code after this list):

  • refuses to send anything and returns an HTTP 406 (Not Acceptable) error to the client
  • serves the .gz/.br file anyway, assuming the user will be able to decompress it manually after the download
  • decompresses the .gz/.br file on the fly and forwards the uncompressed data to the client (probably the least sensible approach of all, given how much extra CPU time miniserve would need just to serve a single request)
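To make the trade-off concrete, those three behaviours could be expressed as a policy switch, roughly like this (a hypothetical `FallbackPolicy`, not an existing miniserve option; the actual file I/O and decompression are left out):

```rust
/// What to do when only myfile.zip.gz/.br exist on disk and the client's
/// Accept-Encoding covers neither. Hypothetical, for discussion only.
enum FallbackPolicy {
    /// Option 1: refuse and return 406 Not Acceptable.
    Reject,
    /// Option 2: send the .gz/.br bytes as-is; the user unpacks them manually.
    ServeCompressedAnyway,
    /// Option 3: decompress on the fly (e.g. via the flate2/brotli crates)
    /// and stream the plain bytes, at the cost of CPU time per request.
    DecompressOnTheFly,
}

/// Status code and Content-Encoding header the server would respond with.
fn response_plan(policy: FallbackPolicy) -> (u16, Option<&'static str>) {
    match policy {
        FallbackPolicy::Reject => (406, None),
        // No Content-Encoding header here: the client asked for myfile.zip
        // but saves bytes that are really myfile.zip.gz/.br.
        FallbackPolicy::ServeCompressedAnyway => (200, None),
        // Plain 200 with an uncompressed body, so no Content-Encoding either.
        FallbackPolicy::DecompressOnTheFly => (200, None),
    }
}

fn main() {
    let (status, encoding) = response_plan(FallbackPolicy::Reject);
    println!("status {status}, Content-Encoding: {encoding:?}");
}
```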

Either way, I'd certainly like to see this implemented in miniserve, if for no other reason than to match what's already available on the other HTTP servers mentioned above.

Serving files from disk is the responsibility of a dependency, actix-files, and it would probably not be easy to make it serve pre-encoded files when it doesn't support that itself. However, there is a PR (actix-web#2615) that adds the feature, but it has been sitting unreviewed for nearly a year. It should be simple for miniserve to add support if/when that PR merges.
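In the meantime, a hand-rolled handler against plain actix-web could do the negotiation itself, roughly along these lines. This is only a sketch, and it bypasses actix-files entirely, so it loses the range-request/ETag/Content-Type handling that NamedFile normally provides:

```rust
// Assumes actix-web = "4" in Cargo.toml; Content-Type detection (e.g. via
// mime_guess, as actix-files does) and error handling are omitted.
use actix_web::{http::header, web, App, HttpRequest, HttpResponse, HttpServer, Responder};
use std::path::PathBuf;

async fn serve(req: HttpRequest, path: web::Path<String>) -> impl Responder {
    // NOTE: no path sanitisation here; a real server must prevent ../ traversal.
    let original = PathBuf::from(".").join(path.into_inner());
    let accept = req
        .headers()
        .get(header::ACCEPT_ENCODING)
        .and_then(|v| v.to_str().ok())
        .unwrap_or("");

    // Prefer a .br, then a .gz sidecar, if the client accepts that encoding
    // (q-values ignored for brevity).
    for (ext, enc) in [("br", "br"), ("gz", "gzip")] {
        if accept.contains(enc) {
            let sidecar = PathBuf::from(format!("{}.{}", original.display(), ext));
            if let Ok(bytes) = std::fs::read(&sidecar) {
                // Serve myfile.zip.br as myfile.zip, Content-Encoding: br.
                return HttpResponse::Ok()
                    .insert_header((header::CONTENT_ENCODING, enc))
                    .body(bytes);
            }
        }
    }

    // No usable sidecar: fall back to the uncompressed original.
    match std::fs::read(&original) {
        Ok(bytes) => HttpResponse::Ok().body(bytes),
        Err(_) => HttpResponse::NotFound().finish(),
    }
}

#[actix_web::main]
async fn main() -> std::io::Result<()> {
    HttpServer::new(|| App::new().route("/{filename:.*}", web::get().to(serve)))
        .bind(("127.0.0.1", 8080))?
        .run()
        .await
}
```

That's essentially what Caddy's precompressed option automates, and what the actix-files PR mentioned above would presumably let miniserve get without hand-rolling anything.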