Lokathor/gl33

need docs about `GLboolean` (8-bit) vs `GL_TRUE`/`GL_FALSE` (32-bit)


So the deal is that `GL_TRUE` and `GL_FALSE` are of type `GLenum`, which is a 32-bit value, while separately you sometimes need to pass a `GLboolean`, which is an 8-bit value.

Since Rust doesn't automatically coerce integers around the way C does, you can't use `GL_TRUE` or `GL_FALSE` in both positions: if we define them as `u8` values they won't match `GLenum` usage, and if we define them as 32-bit values they won't match `GLboolean` usage.
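For concreteness, a minimal sketch of the mismatch (the type aliases and the `depth_mask` function are illustrative stand-ins, not the exact `gl33` API surface):

```rust
pub type GLenum = u32; // 32-bit
pub type GLboolean = u8; // 8-bit

pub const GL_TRUE: GLenum = 1;
pub const GL_FALSE: GLenum = 0;

// Stand-in for a GL call that takes a GLboolean parameter.
fn depth_mask(flag: GLboolean) {
    let _ = flag;
}

fn main() {
    // depth_mask(GL_TRUE); // error[E0308]: expected `u8`, found `u32`
    depth_mask(GL_TRUE as GLboolean); // works, but needs an explicit cast
    depth_mask(GL_FALSE as GLboolean);
}
```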

Also, we can't simply use `bool` rather than `u8`, because sometimes a `GLboolean` is an out param from GL, and my paranoia of GL possibly writing invalid bit patterns to a `bool` is too high.
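To illustrate the out-param concern, here's a rough sketch; `gl_get_booleanv` is a hypothetical stand-in for a raw driver call, not the real binding:

```rust
type GLboolean = u8;

/// Hypothetical stand-in for something shaped like glGetBooleanv.
/// A real driver writes whatever byte it wants through `data`.
unsafe fn gl_get_booleanv(pname: u32, data: *mut GLboolean) {
    let _ = pname;
    unsafe { *data = 1 };
}

fn main() {
    // If `out` were a `bool` and the driver wrote, say, 2 into it,
    // that would be instant undefined behavior. A `u8` is always valid.
    let mut out: GLboolean = 0;
    unsafe { gl_get_booleanv(0x0B71, &mut out) };
    let enabled = out != 0; // convert to a real bool explicitly, afterwards
    println!("enabled: {enabled}");
}
```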

  • For now: users should write `true as _` or `false as _` when they need a `GLboolean`, and then in a breaking future release we can probably make `GLboolean` a proper newtype over `u8`, similar to `VkBool` in the `vkvk` crate (rough sketch below).
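A rough sketch of what that future newtype could look like; this is an assumed design, not what `gl33` currently ships:

```rust
/// Hypothetical future GLboolean newtype, kept as a raw byte internally
/// so that any value the driver writes stays well-defined.
#[derive(Clone, Copy, PartialEq, Eq, Default)]
#[repr(transparent)]
pub struct GLboolean(pub u8);

impl From<bool> for GLboolean {
    fn from(b: bool) -> Self {
        Self(b as u8)
    }
}

impl From<GLboolean> for bool {
    fn from(b: GLboolean) -> Self {
        // treat any non-zero byte as true
        b.0 != 0
    }
}
```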