gl:vertex-attrib-pointer should take an integer as the last argument
fjl opened this issue · 4 comments
I'm following the great modern OpenGL tutorial at http://www.arcsynthesis.org/gltut/.
His words:
If you're wondering why it is (void*)48 and not just 48, that is because of some legacy API
cruft. The reason why the function name is glVertexAttrib“Pointer” is because the last
parameter is technically a pointer to client memory. Or at least, it could be in the past. So
we must explicitly cast the integer value 48 to a pointer type.
If I want to use an integer offset into a vertex array from CL, I have to use (cffi:make-pointer 48), which calls malloc on ECL. glVertexAttribPointer works with integers; passing a pointer to an int will not work, because the function uses the address value itself as the offset.
The same probably holds for glVertexAttribIPointer.
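For illustration, this is roughly what the binding forces today versus what I'd like to write (the argument order is assumed to mirror the C glVertexAttribPointer call, and the literal values, including the 48 from the tutorial, are just placeholders):

```lisp
;; what I have to write now: wrap the byte offset in a foreign pointer
(gl:vertex-attrib-pointer 0 4 :float nil 0 (cffi:make-pointer 48))

;; what I'd like to be able to write instead
(gl:vertex-attrib-pointer 0 4 :float nil 0 48)
```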
Since the binding is autogenerated, I'd need to special-case it to somehow not follow
the spec. What would be the best way to do that?
It also needs to keep accepting a pointer, since not everyone has a recent OpenGL version, and if possible I'd like to keep supporting any old code that relies on the old interpretation. I'm not sure how to efficiently tell CFFI that an argument is either a pointer or a pointer-sized integer, though.
A possible solution would be:
- change the CFFI definition so it accepts a pointer-sized integer
- define a wrapper that converts the argument to an integer using cffi:pointer-address if it satisfies cffi:pointerp (see the sketch below)
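A rough sketch of the wrapper, assuming the generated %gl:vertex-attrib-pointer has been changed so that its last parameter is a pointer-sized integer (the argument names and order here just mirror the C function):

```lisp
(defun vertex-attrib-pointer (index size type normalized stride offset)
  "Accept either an integer byte offset or a foreign pointer as OFFSET."
  (%gl:vertex-attrib-pointer
   index size type normalized stride
   ;; Old code may still pass a foreign pointer; fold it back to its address.
   (if (cffi:pointerp offset)
       (cffi:pointer-address offset)
       offset)))
```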
I can provide a patch that does this in a few days.
If you want to keep the %gl API stable, we can define a custom CFFI type that accepts both Lisp types in its translate-to-foreign method.
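Something along these lines, with a made-up offset-or-pointer type name (the real name and actual type would come from the generator):

```lisp
(cffi:define-foreign-type offset-or-pointer ()
  ()
  (:actual-type :pointer)
  (:simple-parser offset-or-pointer))

(defmethod cffi:translate-to-foreign (value (type offset-or-pointer))
  ;; Integers are taken as raw offsets; pointers pass through unchanged.
  ;; (Alternatively, the actual type could be a pointer-sized integer and
  ;; pointers folded with CFFI:POINTER-ADDRESS, which would avoid
  ;; CFFI:MAKE-POINTER on ECL.)
  (if (cffi:pointerp value)
      value
      (cffi:make-pointer value)))
```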
Both solutions will require changes to the binding generator, though.
Just found out that it's possible to define macroexpansion hooks for CFFI types. This means we can go with the second approach and have CFFI compile the pointerp switch into the foreign-function wrapper.
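Presumably the hook meant here is CFFI's expand-to-foreign generic function; a sketch of compiling the check inline, reusing the hypothetical offset-or-pointer type from above:

```lisp
(defmethod cffi:expand-to-foreign (value (type offset-or-pointer))
  ;; Emit the POINTERP test directly into the compiled foreign-function
  ;; wrapper instead of dispatching through TRANSLATE-TO-FOREIGN at run time.
  (let ((var (gensym "OFFSET")))
    `(let ((,var ,value))
       (if (cffi:pointerp ,var)
           ,var
           (cffi:make-pointer ,var)))))
```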
Thank you! (I was still working on my patch for this...)