Enable anti-aliasing for actor "lv_gltest"?
hartwork opened this issue · 6 comments
Hi!
I could be wrong, but my impression is that actor "lv_gltest" does not activate anti-aliasing and hence misses out on sharper edges. I'd be curious how much sharper the picture would get and how much slower the animation would get once it's activated. I have not looked into the details myself yet. @kaixiong what do you think?
Best, Sebastian
PS: It seems like SDL 2 has GL attributes for anti-aliasing but SDL 1 does not. Also, lv-tool on 0.4.x is not yet setting any GL attributes.
@hartwork, SDL 1.2 does support the multisampling attributes. Having anti-aliasing for all of the included OpenGL visualizations would be nice.
Just to go off on a tangent a bit: multisampling is the simplest way to achieve AA, but with the wide availability of pixel shaders it's not the best way to do it today. Essentially, multisampling renders at a higher resolution and then downsamples, so it's pretty bad for applications that are already fill-rate limited. We don't currently have that problem though.
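As a rough back-of-the-envelope illustration (my own numbers, not measured): at 640x480 with 4x multisampling, the GPU stores and resolves 640 × 480 × 4 ≈ 1.2 million samples per frame instead of roughly 307 thousand, so framebuffer memory and resolve bandwidth grow about 4x.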
For more demanding visualizations later on, we can use AMD FSR, which does upscaling instead.
> @hartwork, SDL 1.2 does support the multisampling attributes.
@kaixiong seems like you're right for SDL >=1.2.6, nice!
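For later reference, here is a minimal sketch of what requesting multisampling through SDL 1.2's GL attributes could look like. It's untested; the window size, bit depth, and 4x sample count are placeholder values, not what lv-tool would actually use:

```c
/* Minimal sketch: request a multisampled GL context with SDL 1.2
 * (the multisample attributes were added in SDL 1.2.6). */
#include <SDL.h>
#include <SDL_opengl.h>

#ifndef GL_MULTISAMPLE
#define GL_MULTISAMPLE 0x809D
#endif

int main(void)
{
    if (SDL_Init(SDL_INIT_VIDEO) != 0)
        return 1;

    /* The attributes must be set before the GL surface is created. */
    SDL_GL_SetAttribute(SDL_GL_MULTISAMPLEBUFFERS, 1);
    SDL_GL_SetAttribute(SDL_GL_MULTISAMPLESAMPLES, 4); /* 4x MSAA */

    if (SDL_SetVideoMode(640, 480, 0, SDL_OPENGL) == NULL) {
        /* The hardware may refuse multisampling; retry without it. */
        SDL_GL_SetAttribute(SDL_GL_MULTISAMPLEBUFFERS, 0);
        SDL_GL_SetAttribute(SDL_GL_MULTISAMPLESAMPLES, 0);
        if (SDL_SetVideoMode(640, 480, 0, SDL_OPENGL) == NULL) {
            SDL_Quit();
            return 1;
        }
    }

    glEnable(GL_MULTISAMPLE); /* usually on by default, but be explicit */

    /* ... render loop goes here ... */

    SDL_Quit();
    return 0;
}
```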
> Just to go off on a tangent a bit: multisampling is the simplest way to achieve AA, but with the wide availability of pixel shaders it's not the best way to do it today. Essentially, multisampling renders at a higher resolution and then downsamples, so it's pretty bad for applications that are already fill-rate limited. We don't currently have that problem though.
>
> For more demanding visualizations later on, we can use AMD FSR, which does upscaling instead.
Interesting pointers. Maybe we can have good old bad AA first and then iterate.
@hartwork, yeah let's do it.
That said, my hardware is not good for benchmarking performance here since my graphics card is pretty high end. Maybe it's time to whip out my Raspberry Pi 3b for testing!
Here's a link to the AMD FSR2 project on GitHub for reference. I think it only supports DirectX and Vulkan though. Porting it to OpenGL should be possible, but it would be extra work.
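Not FSR itself, but the basic scaffolding such an approach needs in plain OpenGL (render into a lower-resolution framebuffer, then upscale to the window) could look roughly like this. This is a sketch using core GL 3.0 calls with placeholder resolutions and no error handling; FSR's actual shader passes would replace the naive linear blit:

```c
/* Sketch only: render at reduced resolution, then upscale to the window.
 * Requires an OpenGL 3.0+ context. */
GLuint fbo, color_tex;
const int low_w = 960,  low_h = 540;  /* render resolution (placeholder) */
const int win_w = 1920, win_h = 1080; /* window resolution (placeholder) */

/* Color texture backing the low-resolution render target. */
glGenTextures(1, &color_tex);
glBindTexture(GL_TEXTURE_2D, color_tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, low_w, low_h, 0,
             GL_RGBA, GL_UNSIGNED_BYTE, NULL);

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, color_tex, 0);

/* Draw the visualization at the reduced resolution. */
glViewport(0, 0, low_w, low_h);
/* ... scene rendering goes here ... */

/* Upscale to the default framebuffer (the window). */
glBindFramebuffer(GL_READ_FRAMEBUFFER, fbo);
glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0);
glBlitFramebuffer(0, 0, low_w, low_h, 0, 0, win_w, win_h,
                  GL_COLOR_BUFFER_BIT, GL_LINEAR);
```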
> @hartwork, yeah let's do it.
>
> That said, my hardware is not good for benchmarking performance here since my graphics card is pretty high end. Maybe it's time to whip out my Raspberry Pi 3b for testing!
@kaixiong 😄 I'm writing from an i5 ThinkPad X220. That's probably (old and) slow enough to make me notice significant performance drops, and to keep me away from fullscreen with non-GL actors.
> Here's a link to the AMD FSR2 project on GitHub for reference. I think it only supports DirectX and Vulkan though. Porting it to OpenGL should be possible, but it would be extra work.
Thanks for the link 👍