Texture::new fails as GL_MAX_RENDERBUFFER_SIZE is 0
nickeb96 opened this issue · 14 comments
I'm having some difficulty getting the example code to run. The first issue was simple to fix. The ? operator needs to be removed from some function calls:
error[E0277]: the `?` operator can only be applied to values that implement `std::ops::Try`
--> src/main.rs:22:9
|
22 | ctx.clear_color(&mut surface, (0.4, 0.4, 0.8, 1.0))?;
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ the `?` operator cannot be applied to type `()`
|
= help: the trait `std::ops::Try` is not implemented for `()`
= note: required by `std::ops::Try::into_result`
error[E0277]: the `?` operator can only be applied to values that implement `std::ops::Try`
--> src/main.rs:23:9
|
23 | ctx.draw(&mut surface, &texture, (100, 150), &DrawConfig::default())?;
| ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ the `?` operator cannot be applied to type `()`
|
= help: the trait `std::ops::Try` is not implemented for `()`
= note: required by `std::ops::Try::into_result`
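For reference, a minimal sketch of the two corrected calls (assuming the readme example's ctx, surface, and texture are already set up as before):

```rust
// Both methods return `()` in this version of crow, so the trailing `?` is dropped.
ctx.clear_color(&mut surface, (0.4, 0.4, 0.8, 1.0));
ctx.draw(&mut surface, &texture, (100, 150), &DrawConfig::default());
```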
The second issue is a little bit harder to fix. When I run it with the example texture given, I get Error: InvalidTextureSize { width: 32, height: 32 }. This comes from the call to let texture = Texture::load(&mut ctx, "./textures/player.png")?;. I'm not sure why these dimensions would be wrong, but from looking at the code I think it has to do with Backend::constants's max_texture_size field being wrong. I'm not sure how this is set up, however.
🤦♂️ I did not update the readme
InvalidTextureSize seems like a bigger problem though.
What happens when you run the following code?
use crow::{Context, glutin::WindowBuilder, Texture};
fn main() -> Result<(), crow::Error> {
let mut ctx = Context::new(WindowBuilder::new())?;
println!("maximum supported texture size: {:?}", ctx.maximum_texture_size());
Ok(())
}
I get the following: maximum supported texture size: (0, 0)
I'm not entirely sure why this is happening now. It was working yesterday when I used this repo as a dependency instead of through crates.io.
I ended up checking, for every OpenGL call, whether it requires a version greater than 3.3, and also checked all possible errors. There were some functions that could have failed unexpectedly, which I now actually check.
glRenderbufferStorage, which is used to create the depth buffer when drawing to textures, requires the requested dimensions to be less than GL_MAX_RENDERBUFFER_SIZE.
As I expected GL_MAX_TEXTURE_SIZE and GL_MAX_RENDERBUFFER_SIZE to be decently close, I use the minimum of both to calculate ctx.maximum_texture_size(). This assumption is obviously wrong on some systems.
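A minimal sketch of that described computation (not the actual crow internals; it assumes a current OpenGL context and the raw gl bindings):

```rust
// Query both limits and treat the smaller one as the usable texture size.
let mut max_texture_size: i32 = 0;
let mut max_renderbuffer_size: i32 = 0;
unsafe {
    gl::GetIntegerv(gl::MAX_TEXTURE_SIZE, &mut max_texture_size);
    gl::GetIntegerv(gl::MAX_RENDERBUFFER_SIZE, &mut max_renderbuffer_size);
}
// If either query reports 0, the computed maximum collapses to 0 as well,
// which matches the (0, 0) seen above.
let maximum_texture_size = std::cmp::min(max_texture_size, max_renderbuffer_size);
```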
The actual problem was attachment completeness and framebuffer size, not the renderbuffer size.
The image has a non-zero width and height (the height of a 1D image is assumed to be 1). The width/height must also be less than GL_MAX_FRAMEBUFFER_WIDTH and GL_MAX_FRAMEBUFFER_HEIGHT respectively (if GL 4.3/ARB_framebuffer_no_attachments).
I see. I would think that GL_MAX_RENDERBUFFER_SIZE would still be greater than 0 if it's needed to make the depth buffer though?
That's what annoys me... it seems like your OpenGL implementation does not support renderbuffers, which is weird.
I added a branch for the last commit without this check
crow = { git = "https://github.com/lcnr/crow", branch = "no_renderbuffer_size" }
Would you mind testing whether that commit works on your machine?
git clone https://github.com/lcnr/crow.git
cd crow
git checkout no_renderbuffer_size
cargo test
I just ran that and here's the output:
Finished test [unoptimized + debuginfo] target(s) in 0.27s
Running target/debug/deps/crow-68b6743ea985742f
running 0 tests
test result: ok. 0 passed; 0 failed; 0 ignored; 0 measured; 0 filtered out
Running target/debug/deps/test-aa8485f00649d6af
running 12 tests
TEST FAILED (invalid return image): lines_offset
TEST FAILED (invalid return image): debug_lines
test result: FAILED. 10 passed; 2 failed; 0 ignored; 0 measured; 0 filtered out
error: test failed, to rerun pass '--test test'
If it's any help, I tried running the following and it looks like the max texture size is working again.
use crow::{Context, glutin::WindowBuilder, glutin::EventsLoop};
fn main() -> Result<(), crow::Error> {
let ctx = Context::new(WindowBuilder::new(), EventsLoop::new())?;
println!("maximum supported texture size: {:?}", ctx.maximum_texture_size());
Ok(())
}
output:
maximum supported texture size: (16384, 16384)
Also, the example code in the readme works again with this branch.
I am pretty damn sure your graphics driver has a bug...
Can you try running the same code using the branch fffffffffff?
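(Presumably by switching the dependency line to crow = { git = "https://github.com/lcnr/crow", branch = "fffffffffff" }, mirroring the one above.)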
And here is the output of that code on the fffffffffff branch:
Finished dev [unoptimized + debuginfo] target(s) in 0.06s
Running `target/debug/crowgame`
[/Users/Nick/.cargo/git/checkouts/crow-e581a12d7c9d3522/6621b53/src/backend/mod.rs:46] get(gl::MAX_TEXTURE_SIZE, "max_texture_size") = 16384
[/Users/Nick/.cargo/git/checkouts/crow-e581a12d7c9d3522/6621b53/src/backend/mod.rs:47] get(gl::MAX_RENDERBUFFER_SIZE, "max_renderbuffer_size") = 16384
[/Users/Nick/.cargo/git/checkouts/crow-e581a12d7c9d3522/6621b53/src/backend/mod.rs:51] get(gl::MAX_FRAMEBUFFER_WIDTH, "max_framebuffer_width") = 0
[/Users/Nick/.cargo/git/checkouts/crow-e581a12d7c9d3522/6621b53/src/backend/mod.rs:53] get(gl::MAX_FRAMEBUFFER_HEIGHT, "max_framebuffer_height") = 0
[/Users/Nick/.cargo/git/checkouts/crow-e581a12d7c9d3522/6621b53/src/backend/mod.rs:55] GlConstants { max_texture_size: (cmp::min(size, framebuffer_width), cmp::min(size, framebuffer_height)), } = GlConstants {
max_texture_size: (
0,
0,
),
}
maximum supported texture size: (0, 0)
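(So the 0 returned for GL_MAX_FRAMEBUFFER_WIDTH/HEIGHT propagates through the cmp::min calls above, which is why maximum_texture_size() ends up as (0, 0) even though GL_MAX_TEXTURE_SIZE itself is 16384.)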
I can try it out on a different computer when I get home later. So far I've been doing it on a semi-new MacBook.
Nevermind:
The image has a non-zero width and height (the height of a 1D image is assumed to be 1). The width/height must also be less than GL_MAX_FRAMEBUFFER_WIDTH and GL_MAX_FRAMEBUFFER_HEIGHT respectively (if GL 4.3/ARB_framebuffer_no_attachments).
Afaik you have GL 4.2 and I don't use the ARB_framebuffer_no_attachments extension.
Will be fixed asap
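A rough sketch of what such a guard could look like (gl_version and supports_framebuffer_no_attachments are hypothetical helpers, not actual crow APIs):

```rust
// Only query GL_MAX_FRAMEBUFFER_WIDTH/HEIGHT when the context guarantees them
// (GL 4.3+ or the ARB_framebuffer_no_attachments extension); otherwise fall
// back to the texture/renderbuffer limits alone.
// `gl_version` and `supports_framebuffer_no_attachments` are assumed helpers.
let framebuffer_limit = if gl_version >= (4, 3) || supports_framebuffer_no_attachments {
    let (mut width, mut height): (i32, i32) = (0, 0);
    unsafe {
        gl::GetIntegerv(gl::MAX_FRAMEBUFFER_WIDTH, &mut width);
        gl::GetIntegerv(gl::MAX_FRAMEBUFFER_HEIGHT, &mut height);
    }
    Some((width, height))
} else {
    None
};
```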
Should now be fixed, please try it once again and tell me your output.
Thank you for the issue and the quick responses ❤️
I just gave it another go on the master branch and it all works perfectly. Thanks again for all of your help!