
Corrupted graphics on AMD Radeon Pro


On a system with an AMD Radeon Pro 580 (8192 MB) video card running macOS 10.14 Mojave, Steven builds, runs, and connects to a server, but the video is severely mangled (the problem doesn't occur on a different machine with the same OS but an Intel card):

[screenshot, 2018-09-29 2:04 PM: severely mangled in-game view]

You can almost make out the world, and it changes as you move (there's no mouse bug like on Linux, #73, since macOS doesn't use Wayland), but it's not really usable. Commenting out most of the rendering except the sky reveals this striped pattern:

[screenshot, 2018-09-29 4:05 PM: striped pattern with everything but the sky disabled]

In full screen the image is completely striped; each band is 16 pixels tall:

[screenshot, 2018-09-29 4:06 PM: full screen covered in 16-pixel bands]

The light blue is the sky color, which is expected to cover the full screen; the dark blue is solid RGB (0, 0, 255).

The problem appears to be related to sampling. If I make this change:

diff --git a/src/render/mod.rs b/src/render/mod.rs
index 9644d5d..1eb9b5e 100644
--- a/src/render/mod.rs
+++ b/src/render/mod.rs
@@ -42,7 +42,7 @@ use std::sync::mpsc;
 const ATLAS_SIZE: usize = 1024;
 
 // TEMP
-const NUM_SAMPLES: i32 = 1;
+const NUM_SAMPLES: i32 = 2;
 
 pub struct Camera {
     pub pos: cgmath::Point3<f64>,

then it renders correctly! But I don't completely understand why, or whether there is a better fix. Any ideas?

Setting NUM_SAMPLES to 2 causes the game to always render solid black (except on the title screen) on Ubuntu Linux 18.04.1, so I think a real fix may need to be more complicated. Should it be set (to 1 or 2?) based on some property of the graphics card?
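
One option might be to query the driver once at startup instead of hard-coding the constant. A rough sketch only (pick_num_samples is a hypothetical helper, not existing code; it assumes the raw gl bindings used by src/gl/mod.rs are in scope and a GL context is already current):

// Sketch only: choose the sample count from what the driver reports rather
// than hard-coding NUM_SAMPLES. pick_num_samples is a hypothetical name.
fn pick_num_samples() -> i32 {
    let mut max_samples: i32 = 0;
    unsafe {
        // GL_MAX_SAMPLES is the largest sample count this driver supports
        // for multisample textures/renderbuffers.
        gl::GetIntegerv(gl::MAX_SAMPLES, &mut max_samples);
    }
    // Prefer 2 (works around the Radeon Pro corruption) but never exceed
    // the driver's limit (VMware/Ubuntu reports 1).
    max_samples.max(1).min(2)
}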

Here's what I have so far: in addition to setting NUM_SAMPLES to 2 to work around the video corruption on the Radeon Pro 580, image_2d_sample now clamps the requested samples to GL_MAX_SAMPLES (which is 1 on VMware/Ubuntu). This allows Steven to render the title screen and terrain on both systems:

diff --git a/src/gl/mod.rs b/src/gl/mod.rs
index 14a9206..b2e95d9 100644
--- a/src/gl/mod.rs
+++ b/src/gl/mod.rs
@@ -370,8 +370,18 @@ impl Texture {
                     format: TextureFormat,
                     fixed: bool) {
         unsafe {
+            let result: &mut [i32] = &mut [0; 1];
+            gl::GetIntegerv(gl::MAX_SAMPLES, &mut result[0]);
+            let use_samples =
+                if samples > result[0] {
+                    println!("glTexImage2DMultisample: requested {} samples but GL_MAX_SAMPLES is {}", samples, result[0]);
+                    result[0]
+                } else {
+                    samples
+                };
+
             gl::TexImage2DMultisample(target,
-                           samples,
+                           use_samples,
                            format,
                            width as i32,
                            height as i32,

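As a side note on that diff, GetIntegerv can write straight into a local i32, so the temporary &mut [0; 1] slice isn't needed. A minimal standalone sketch of the same clamp (clamp_samples is a hypothetical name; same raw gl bindings assumed):

// Sketch of the clamp on its own; clamp_samples is a hypothetical helper.
fn clamp_samples(requested: i32) -> i32 {
    let mut max_samples: i32 = 0;
    unsafe {
        gl::GetIntegerv(gl::MAX_SAMPLES, &mut max_samples);
    }
    if requested > max_samples {
        println!("glTexImage2DMultisample: requested {} samples but GL_MAX_SAMPLES is {}",
                 requested, max_samples);
        max_samples
    } else {
        requested
    }
}
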
That said, I'm not convinced this is the best solution. Open questions:

  • What exactly is causing the corrupted/striped video with NUM_SAMPLES=1? Missing image data? Can the rendering be corrected without increasing the sample count?
  • If NUM_SAMPLES=1, is multisampling really needed at all? Would it be possible/worthwhile to switch from glTexImage2DMultisample() to glTexImage2D()? (A rough sketch of that fallback follows this list.)
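
For the second question, this is roughly what a single-sample fallback could look like (alloc_color_attachment is a hypothetical helper, not existing code; it assumes the raw gl bindings and that the matching texture target is already bound):

// Hypothetical sketch: allocate plain texture storage when only one sample
// is wanted, multisample storage otherwise.
unsafe fn alloc_color_attachment(samples: i32, width: i32, height: i32) {
    if samples <= 1 {
        // Non-multisampled storage: no sample count to get wrong, and the
        // attachment can be read with a normal sampler2D.
        gl::TexImage2D(gl::TEXTURE_2D,
                       0,                 // mip level
                       gl::RGBA8 as i32,  // internal format
                       width,
                       height,
                       0,                 // border, must be 0
                       gl::RGBA,
                       gl::UNSIGNED_BYTE,
                       std::ptr::null()); // no initial pixel data
    } else {
        gl::TexImage2DMultisample(gl::TEXTURE_2D_MULTISAMPLE,
                                  samples,
                                  gl::RGBA8,
                                  width,
                                  height,
                                  gl::TRUE); // fixed sample locations
    }
}

If the single-sample path were taken, any shader that reads the attachment would presumably also need to switch from sampler2DMS to sampler2D, which is probably most of the work.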

Could this be related to Thinkofname/steven-go#39?