M0nica/httriri

Decrease page load time by optimizing gifs.

SijmenHuizenga opened this issue · 6 comments

I 😻 the project! When I first visited the page, it took like 15 seconds to load... That makes me a bit 😿 The problem is the size of the gifs: currently 35MB in total. With the expected addition of many more gifs, this will only increase. I propose to optimize all gifs in two ways:

  1. Resize large gifs to 300px width. Most gifs already have a width smaller than 300, but some are larger. Reducing the width of these larger gifs won't impact the user experience since the gifs are shown at 300px anyway.

  2. Compressing the gifs lossily. Using a website like https://ezgif.com/optimize/ we can optimize the images without much visual impact. This can drastically decrease the file size. I propose to use the lossy GIF optimization method with the compression level set to 100 (a rough script for both steps is sketched below).
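For reference, here is a minimal sketch of what those two steps could look like as a script. The `img/` folder name is an assumption (the repo layout may differ), and it assumes gifsicle 1.92+ is installed; as far as I know, ezgif's lossy optimizer is built on gifsicle's `--lossy` option, so `--lossy=100` should roughly correspond to compression level 100.

```python
# Sketch only: paths and gifsicle availability are assumptions, not project facts.
# Step 1: shrink gifs wider than 300px; step 2: lossy-optimize them with gifsicle.
import subprocess
from pathlib import Path
from PIL import Image

GIF_DIR = Path("img")  # assumed location of the gifs
TARGET_WIDTH = 300     # display width used on the site
LOSSY_LEVEL = 100      # roughly ezgif's "compression level 100"

for gif in sorted(GIF_DIR.glob("*.gif")):
    with Image.open(gif) as im:
        width = im.width
    # --batch modifies the file in place; --lossy needs gifsicle >= 1.92
    cmd = ["gifsicle", "--batch", "-O3", f"--lossy={LOSSY_LEVEL}"]
    if width > TARGET_WIDTH:
        cmd += ["--resize-width", str(TARGET_WIDTH)]  # only shrink, never enlarge
    cmd.append(str(gif))
    subprocess.run(cmd, check=True)
    print(f"optimized {gif.name} ({width}px wide before)")
```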

As an example, here is 101 - the largest gif - before applying these optimizations: (4.6MB)

[gif 101, original]

And here it is after reducing the width and applying lossy compression (1.7MB):

[gif 101, resized and compressed]

I would be happy to help. Let's discuss!

@SijmenHuizenga I agree that the images could use some further optimization and would be happy to have someone work on it. I'd like to automate this process as much as possible.

We have Imgbot set up, but "By default imgbot is set to lossless compression, that's the reason of low compression rates, you can tweak that to further reduce gif's size", so I think it would be helpful to revisit that in the next round of image optimizations. #33 (comment)

I just pushed an update (1deff65) that will hopefully make the Imgbot compression a bit more aggressive on the gifs and further reduce their file sizes. It seems to have reduced the image size by 33% according to the latest PR opened by Imgbot: #69

I am still inspecting the quality of the compressions.

It would still be helpful to have images wider than 300px resized down to a width of 300px.

Great to see Imgbot already supports compression!

Unfortunately Imgbot doesn't support resizing, and there doesn't seem to be another bot that does. I'm not able to file a PR for Imgbot because it's written in C#.

As an alternative, we could write a simple workflow like the one sketched below to resize large images. The downside of separating compression and resizing into two processes is that they will conflict with each other when they run at the same time. They would need to run sequentially, and I don't think that is possible for workflows and bots. We could also implement compression in our own workflow (together with resizing) and no longer use Imgbot.
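Something like the following could be the resize step of that workflow. This is only a sketch under assumptions: the `img/` directory and the 300px limit are placeholders, and it keeps a single frame duration for the whole gif for simplicity.

```python
# Minimal resize step for a CI workflow (directory name and width are assumptions).
# Any gif wider than 300px is shrunk to 300px; smaller gifs are left untouched.
from pathlib import Path
from PIL import Image, ImageSequence

MAX_WIDTH = 300
GIF_DIR = Path("img")  # assumed gif location

for path in sorted(GIF_DIR.glob("*.gif")):
    with Image.open(path) as im:
        if im.width <= MAX_WIDTH:
            continue
        new_size = (MAX_WIDTH, round(im.height * MAX_WIDTH / im.width))
        # Convert each frame to RGBA before resizing; Pillow re-quantizes on save.
        # Simplification: applies the first frame's duration to all frames.
        frames = [
            frame.convert("RGBA").resize(new_size, Image.LANCZOS)
            for frame in ImageSequence.Iterator(im)
        ]
        duration = im.info.get("duration", 100)
        loop = im.info.get("loop", 0)
    # Overwrite the original so the workflow can commit the change.
    frames[0].save(
        path,
        save_all=True,
        append_images=frames[1:],
        duration=duration,
        loop=loop,
        optimize=True,
    )
    print(f"resized {path.name} to {new_size}")
```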

Or we go the manual route: document how to resize and manually resize existing images.

@SijmenHuizenga thanks for putting that workflow together. Right now Imgbot runs post-merge, once the optimization savings reach a certain threshold.

So if we add the GitHub Action, it will resize the images, and then at some point afterwards they will be optimized. Imgbot shouldn't conflict with this script, since the script runs on PRs and Imgbot runs after the PR is merged.

Does that order of operations make you feel comfortable with separating out the resizing and optimization for now?

Oooooh, I did not realize Imgbot runs after merge. Yes, with that order the two would work together perfectly! I will create a PR.

So apparently this thing did not work. See #74 and #77

My conclusion: Automatically adding a commit on a forked branch during a pull-request workflow is not possible with out-of-the-box tooling.

How to go from here? Some ideas, none perfect:

Only check and report too-large images (a minimal check script is sketched after these ideas). Then people would have to resize the gifs manually. That's more work for humans but simpler for computers. In the meantime we could bump the issue in the Imgbot repo asking for resize support.

Integrate the Imgbot functionality into our own workflow? Imgbot's compression consists of running Gifsicle and ImageMagick. We could run those commands from a workflow, together with resizing, on each commit to master and commit the results back to master when needed. The downside is that we don't get nice PRs like Imgbot opens, and committing directly to master is just ugly.
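For the first idea, the check could be as small as this (again a sketch; the `img/` path is an assumption). It only reports offending files and fails the workflow, leaving the actual resizing to a human:

```python
# Sketch of a "check and report only" step: fail CI if any gif is wider than 300px.
import sys
from pathlib import Path
from PIL import Image

MAX_WIDTH = 300
too_wide = []

for path in sorted(Path("img").glob("*.gif")):  # assumed gif location
    with Image.open(path) as im:
        if im.width > MAX_WIDTH:
            too_wide.append(f"{path.name}: {im.width}px wide")

if too_wide:
    print("These gifs are wider than 300px, please resize them:")
    print("\n".join(too_wide))
    sys.exit(1)  # non-zero exit marks the check as failed

print("All gifs are within the width limit.")
```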

====

Maybe a better idea: instead of optimizing the images in the git repo, optimize them when the website is built for deployment. In the repo we would keep the original-size images, and at build time we resize and optimize them. That way, if we decide that we want all gifs to be displayed at 350px, we can change a single number without having to update all the gifs. I'm not sure how difficult this is since I don't see how the website is currently deployed. Can you elaborate on this @M0nica?