grimfang4/SDL_FontCache

Serialize/deserialize font cache to/from memory.

AlexanderKotliar opened this issue · 11 comments

Hello, Jonathan! Thank You for a useful library!

On mobile devices it takes a few seconds to generate each font cache. So it would be nice to have a way to save and load it from memory, to be flexible. One could add these memory blocks to the application's overall cache and store it to disk.

Best regards,
Alexander

You're welcome! So your suggestion is a sort of format specification so that we can save and load the internal font cache? I like that idea and it would certainly come in handy.

SDL_FontCache uses a set of textures (the font cache levels) to store the glyph graphics and it stores a map of position rects for each loaded glyph. We would need some text format saver/loader and an image saver/loader or else delegate to the application by providing access to these data objects. I definitely prefer the second approach to keep the dependencies clean.

A text saver/loader is too big for such a small library.

Can I iterate over and get/set the internal data with FC_GetGlyphCacheLevel/FC_SetGlyphCacheLevel and FC_GetGlyphData/FC_SetGlyphData?
What is a cache level?
What range can a codepoint have?

FC_GetNumCacheLevels() will tell you how many textures the font is holding. Then use FC_GetGlyphCacheLevel() to get each of them.

FC_SetGlyphCacheLevel() can be used when loading to tell the font to use a particular texture you've loaded.

There is not currently a way to get all of the glyph data, so we should add that. Right now, you'd have to know which codepoints have been loaded already in order to use FC_GetGlyphData(). A codepoint can be any 4-byte UTF-8 sequence (endianness might matter). Once you have all of that data, you would use FC_MakeGlyphData() and FC_SetGlyphData() for each codepoint.

Alright, I just pushed 51f588e. You should be able to use FC_GetNumCodepoints() to size an array, then FC_GetCodepoints() to fill the array with all of the loaded codepoints. I also exposed FC_UploadGlyphCache() so you can pass in SDL_Surfaces instead of textures and the library will do the conversion. This will make it possible to save/load from memory (or elsewhere, via SDL_RWops).
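Putting those pieces together, the glyph map could be flattened into a plain byte buffer. Here is a minimal sketch; the GlyphRecord struct and the pack/unpack helpers are hypothetical (my own, not part of the library), though the fields mirror what FC_GetGlyphData() reports and what FC_MakeGlyphData() needs on reload:

```c
#include <stdint.h>
#include <string.h>

/* Hypothetical record for one glyph: the codepoint plus the cache level
   and the glyph's rect within that cache level's texture. */
typedef struct {
    uint32_t codepoint;
    int32_t  cache_level;
    int16_t  x, y;
    uint16_t w, h;
} GlyphRecord;

#define GLYPH_RECORD_SIZE 16  /* fixed wire size, independent of struct padding */

static void put_u16(uint8_t* p, uint16_t v) { p[0] = (uint8_t)v; p[1] = (uint8_t)(v >> 8); }
static void put_u32(uint8_t* p, uint32_t v) { put_u16(p, (uint16_t)v); put_u16(p + 2, (uint16_t)(v >> 16)); }
static uint16_t get_u16(const uint8_t* p) { return (uint16_t)(p[0] | (p[1] << 8)); }
static uint32_t get_u32(const uint8_t* p) { return get_u16(p) | ((uint32_t)get_u16(p + 2) << 16); }

/* Serialize one record into buf (little-endian); returns bytes written. */
size_t pack_glyph(uint8_t* buf, const GlyphRecord* g)
{
    put_u32(buf + 0,  g->codepoint);
    put_u32(buf + 4,  (uint32_t)g->cache_level);
    put_u16(buf + 8,  (uint16_t)g->x);
    put_u16(buf + 10, (uint16_t)g->y);
    put_u16(buf + 12, g->w);
    put_u16(buf + 14, g->h);
    return GLYPH_RECORD_SIZE;
}

/* Deserialize one record from buf; returns bytes read. */
size_t unpack_glyph(const uint8_t* buf, GlyphRecord* g)
{
    g->codepoint   = get_u32(buf + 0);
    g->cache_level = (int32_t)get_u32(buf + 4);
    g->x = (int16_t)get_u16(buf + 8);
    g->y = (int16_t)get_u16(buf + 10);
    g->w = get_u16(buf + 12);
    g->h = get_u16(buf + 14);
    return GLYPH_RECORD_SIZE;
}
```

On load, each unpacked record would be fed back through FC_MakeGlyphData() and FC_SetGlyphData(), with the textures themselves restored separately via FC_SetGlyphCacheLevel() or FC_UploadGlyphCache().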

Thank You for quick reply, Jonathan!

There's no way to get an SDL_Surface from an SDL_Texture.
I catch the surfaces in the callback just before SDL_FreeSurface. It works, but new symbols aren't rendered, maybe because some of FC_Font's internal state was not saved.
I found a compromise solution for now - loading an empty string.

SDL_GetHint returns NULL when no hint is set, so we need to check:

char old_filter_mode[256]; // Save it so we can change the hint value in the meantime
const char* mode = SDL_GetHint(SDL_HINT_RENDER_SCALE_QUALITY);
snprintf(old_filter_mode, sizeof(old_filter_mode), "%s", mode ? mode : "");
...
SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, strlen(old_filter_mode) ? old_filter_mode : NULL);

I increased the buffer size because the hint may be a string like "linear", etc.

Oh yeah, you can use SDL_LockTexture() to get pixels into a surface, but it's not guaranteed to be the latest texture data. You could use a temporary render target and do SDL_RenderReadPixels() from it, which should ensure the latest data. I think that's how most people would save an SDL_Texture to a file?

That sounded like a challenge, so I wrote up this function. It mostly does what I expected, though reuploading displays some sort of precision drift. Something is off a little bit.

SDL_Surface* copy_texture_to_surface(SDL_Renderer* renderer, SDL_Texture* texture)
{
    if(renderer == NULL || texture == NULL)
        return NULL;

    // Ensure texture is ready for copying
    SDL_SetTextureColorMod(texture, 255, 255, 255);
    SDL_SetTextureAlphaMod(texture, 255);

    // Figure out texture pixel format
    Uint32 format;
    int w, h;
    if(SDL_QueryTexture(texture, &format, NULL, &w, &h) < 0)
        return NULL;

    // Prepare surface pixel format
    int bpp;
    Uint32 Rmask, Gmask, Bmask, Amask;
    if(!SDL_PixelFormatEnumToMasks(format, &bpp, &Rmask, &Gmask, &Bmask, &Amask))
        return NULL;

    // Create temp render target
    SDL_Texture* temp = SDL_CreateTexture(renderer, format, SDL_TEXTUREACCESS_TARGET, w, h);
    if(temp == NULL)
        return NULL;

    // Copy the texture
    SDL_SetTextureBlendMode(temp, SDL_BLENDMODE_NONE);
    SDL_SetRenderTarget(renderer, temp);
    SDL_RenderCopy(renderer, texture, NULL, NULL);

    SDL_Surface* surface = SDL_CreateRGBSurface(SDL_SWSURFACE, w, h, bpp, Rmask, Gmask, Bmask, Amask);
    if(surface == NULL)
    {
        SDL_SetRenderTarget(renderer, NULL);
        SDL_DestroyTexture(temp);
        return NULL;
    }

    // Read the temp pixels
    if(SDL_RenderReadPixels(renderer, NULL, format, surface->pixels, surface->pitch) < 0)
    {
        SDL_FreeSurface(surface);
        SDL_SetRenderTarget(renderer, NULL);
        SDL_DestroyTexture(temp);
        return NULL;
    }

    SDL_SetRenderTarget(renderer, NULL);
    SDL_DestroyTexture(temp);
    return surface;
}

It looks impressive!
It needs to clear the render target:
SDL_SetRenderTarget(renderer, temp);
SDL_SetRenderDrawColor(renderer, 0, 0, 0, 255);
SDL_RenderClear(renderer);
SDL_RenderCopy(renderer, texture, NULL, NULL);

And the resulting surface is flipped vertically. How can I flip it back?

You could use SDL_RenderCopyEx() with a vertical flip in the texture render step. Otherwise, you can do it by hand by swapping scanlines in the SDL_Surface. I have to do this in SDL_gpu:
https://github.com/grimfang4/sdl-gpu/blob/master/src/renderer_GL_common.inl#L2428
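The by-hand version is just a scanline swap. A minimal sketch (my own helper, not part of SDL_FontCache), taking the raw buffer the way you'd pass surface->pixels, surface->pitch, and surface->h:

```c
#include <stdlib.h>
#include <string.h>

/* Flip an image buffer vertically in place by swapping scanlines.
   pixels/pitch/height correspond to surface->pixels, surface->pitch,
   and surface->h of an SDL_Surface. Returns 0 on success. */
int flip_scanlines(void* pixels, int pitch, int height)
{
    unsigned char* rows = (unsigned char*)pixels;
    unsigned char* tmp = (unsigned char*)malloc((size_t)pitch);
    if(tmp == NULL)
        return -1;

    for(int y = 0; y < height / 2; ++y)
    {
        unsigned char* top = rows + (size_t)y * (size_t)pitch;
        unsigned char* bottom = rows + (size_t)(height - 1 - y) * (size_t)pitch;
        memcpy(tmp, top, (size_t)pitch);
        memcpy(top, bottom, (size_t)pitch);
        memcpy(bottom, tmp, (size_t)pitch);
    }

    free(tmp);
    return 0;
}
```

Swapping whole rows via the pitch (rather than per-pixel) keeps it format-agnostic, so it works for any pixel format the surface happens to use.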

Thank You, Jonathan!

Now cache saving is working!
Although I had to add FC_Get/SetLastGlyph to save the current position in the texture.

But there is one problem with copy_texture_to_surface: it slightly blurs the image. After a few round trips, small fonts degenerate.
SDL_FontCache creates its textures with SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "0");
At the beginning of copy_texture_to_surface I set it too, but it doesn't help.
How can I fix it?

I'm not really sure why this happens. It could be a subpixel offset issue? I don't have the time right now to tinker with it.