Memory corruption in "shooter" example
When running the shooter example on 64-bit Linux, the palettes get corrupted after a few seconds. The game sometimes segfaults, but not always. The final colours, and whether the game crashes or not, both seem random (I guess the nondeterminism comes from the addresses the OS hands to the process).
It seems that one of the latest updates has broken something and is causing this corruption. This sample has always worked without problems, but right now something isn't working as it should. I'll fix this ASAP.
Thanks for pointing this one out!
Thanks for the quick fix!
This fix makes it impossible to mix tilesets with different palettes in a single layer. If the tilesets' palettes don't match, the colours of the secondary tilesets will be wrong. It can't be fixed because, in the Tilengine architecture, the palette is a property of the layer, not of the tileset.
Do you think there are any architectural changes that could allow this to work in the future? The multi-tileset feature is definitely less useful this way.
Some possibilities I can imagine:
- Give the layer a master palette derived from all tilesets.
- Pros: easy to implement
- Cons: too magical, you have to offset all the pixel data because the secondary palettes don't start from 0 anymore. (Therefore it becomes impossible to share a tileset between multiple layers.)
- Give layers multiple palettes, add "palette" bits to tiles, similar to the existing "tileset" bits (rough sketch after this list).
- Pros: similar to how real consoles work (e.g. clouds & bushes in SMB1 have the same tiles but different palettes).
- Cons: you might end up with duplicate palettes in the same layer. Probably a big change, I don't know how feasible it is.
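To make the second option more concrete, here is a rough sketch of what "palette" bits in a tile word could look like. The field widths, names, and the `Tile` struct itself are made up for illustration and don't reflect Tilengine's actual tile layout:

```c
#include <stdint.h>

/* Hypothetical layout: a few flag bits select one of the layer's palettes,
 * the same way the existing bits select which tileset the index refers to. */
#define TILE_TILESET_SHIFT  8                        /* existing "tileset" bits */
#define TILE_TILESET_MASK   (0x7u << TILE_TILESET_SHIFT)
#define TILE_PALETTE_SHIFT  11                       /* new "palette" bits */
#define TILE_PALETTE_MASK   (0x7u << TILE_PALETTE_SHIFT)

typedef struct
{
    uint16_t index;     /* tile graphic inside the tileset */
    uint16_t flags;     /* flip / priority / tileset / palette bits */
}
Tile;

/* Clouds and bushes in SMB1: same index, different palette bits. */
static inline unsigned tile_palette(Tile t)
{
    return (t.flags & TILE_PALETTE_MASK) >> TILE_PALETTE_SHIFT;
}
```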
Thanks for your feedback!
After a few hours of sleep, I think I can have a fix for the multiple palettes issue. The main problem is that many features in tilengine (tile priority, blending, mosaic...) use a temporary line buffer where the colour indexes are stored and then rendered in a second pass. This buffer doesn't hold any information about the palette of each pixel, just the 8-bit index, because the palette is a layer property, and that palette is used to fetch the final RGB values of the pixels.
If this line buffer were 32-bit RGB, with the actual colour fetched from the palette before storing it, and new 32 -> 32 blitters were provided for the second pass in addition to the current 8 -> 32 ones, it should work without degrading performance. I have to look carefully for unforeseen side effects, but in theory it should work.
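Roughly, the idea is something like this (a simplified sketch with invented names, not the actual internals): the first pass resolves each index to RGBA through whichever palette applies to that pixel, so the buffer no longer has to remember palettes, and the second pass becomes a plain 32 -> 32 blit:

```c
#include <stdint.h>
#include <string.h>

#define MAX_WIDTH 1024

/* was: uint8_t line_buffer[MAX_WIDTH] holding palette indexes */
static uint32_t line_buffer[MAX_WIDTH];              /* now full RGBA per pixel */

/* first pass: the palette is chosen per pixel/tile, so the colour can be
 * resolved immediately instead of being deferred to the second pass */
static void store_pixel(int x, uint8_t index, const uint32_t* palette)
{
    if (index != 0)                                   /* index 0 = transparent */
        line_buffer[x] = palette[index];
}

/* second pass: a 32 -> 32 blitter instead of the current 8 -> 32 one */
static void blit_line(uint32_t* dst, int width)
{
    memcpy(dst, line_buffer, width * sizeof(uint32_t));
}
```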
For the first pass, the handling of palettes should be (a rough sketch follows the list):
- layer palette is made optional but with higher priority, and is NULL by default (no layer palette)
- if layer palette is null, fetch pixel colour from tile's tileset palette
- if layer palette is not null, fetch pixel colour from layer palette
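In code, the selection would be something like the snippet below (illustrative structs, not the actual internal types):

```c
#include <stddef.h>
#include <stdint.h>

typedef struct { uint32_t color[256]; }  Palette;
typedef struct { Palette* palette; }     Tileset;    /* per-tileset palette */
typedef struct { Palette* palette; }     Layer;      /* optional, NULL by default */

static const Palette* pick_palette(const Layer* layer, const Tileset* tileset)
{
    /* the optional layer palette has priority; otherwise fall back to the
     * palette of the tileset the current tile belongs to */
    return layer->palette != NULL ? layer->palette : tileset->palette;
}
```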
This would keep compatibility with the current global palette and allow new per-tileset palettes. Of course, with this approach, raster palette effects won't work well with multiple palettes per line, because this effect works by setting the layer palette, which has priority over tileset palettes. But at least the conflict is delayed until both features are combined (multiple tilesets and colour raster effects at once). I think it's a much better compromise.
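The conflicting case is a raster callback along these lines (sketch only; `sky_palettes` is a hypothetical array with one pre-built palette per scanline, and the Tilengine calls are the public API as I recall it, so double-check the signatures in Tilengine.h):

```c
#include "Tilengine.h"

/* hypothetical per-scanline palettes for a sky gradient effect */
extern TLN_Palette sky_palettes[];

static void raster_callback(int scanline)
{
    /* setting a layer palette makes it non-NULL, so from this scanline on
     * it has priority over every tileset palette of that layer */
    TLN_SetLayerPalette(0, sky_palettes[scanline]);
}

/* during setup: TLN_SetRasterCallback(raster_callback); */
```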
Classic 2D systems don't have layer or tileset palettes, but a fixed number of global 16-colour palettes that are shared across all tiles. Colour raster effects are accomplished by changing the colours of these palettes, instead of setting a reference like Tilengine does. Tilengine is much more flexible and artist-friendly in this approach, as it doesn't force developers to work with a fixed number of shared palettes for all items; instead it allows each element to have its own local palette.
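For comparison, a classic-style raster effect is closer to this (purely illustrative, not any real console's registers): the palettes are a fixed shared bank, and the effect rewrites their colours each scanline instead of swapping which palette object a layer points to.

```c
#include <stdint.h>

/* fixed bank of 16-colour palettes shared by every tile on screen */
static uint32_t cram[8][16];

static void classic_raster_effect(int scanline)
{
    /* e.g. darken the backdrop colour (palette 0, entry 0) towards the
     * bottom of a 240-line screen to fake a sky gradient */
    uint32_t red = (uint32_t)(240 - scanline) & 0xFFu;
    cram[0][0] = 0xFF000000u | (red << 16);
}
```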