
On 20/01/15 08:26, Sven Panne wrote:
>> glCreateShader returns 0 on error, and using that "not a shader" id for other calls is bound to cause chaos. https://www.opengl.org/sdk/docs/man/html/glCreateShader.xhtml
>
> This is not a bug, this is intentional. "Shader" is an instance of "ObjectName", so you can check via "isObjectName" whether you actually got a shader back. Another possibility is to retrieve the value of the "errors" state variable. In general, it's a good idea during development to check "errors" from time to time at a few strategic places (or use "reportErrors" from the GLUT package). Almost all API calls can fail in one way or another, and doing error checking in the binding layer would lead to horrible performance. This is totally in the "OpenGL spirit", where basically no API call returns a success/failure indicator (apart from the global error state).
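For concreteness, here is a minimal sketch of that kind of application-level check, assuming the public Graphics.Rendering.OpenGL API (createShader, isObjectName, get, errors); createShaderChecked is just a hypothetical helper name, not something the binding provides:

---8<---
import Control.Monad (unless)
import Graphics.Rendering.OpenGL

-- Hypothetical helper: create a shader, then verify that we really got a
-- shader name back, inspecting the GL error state if we did not.
createShaderChecked :: ShaderType -> IO Shader
createShaderChecked ty = do
   shader <- createShader ty
   ok <- isObjectName shader   -- False when glCreateShader returned 0
   unless ok $ do
      errs <- get errors       -- drains the GL error state
      fail ("createShader " ++ show ty ++ " failed: " ++ show errs)
   return shader
---8<---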
Ok, that makes sense. But for the second bug, the only way to tell whether the glGetShader call failed and left the memory unchanged is to check the GL errors within the binding itself. I doubt shaderVar will ever be used in an inner loop, so the performance hit should be acceptable.
>> The second bug is much more serious: a lack of error checking in shaderVar:
>>
>> ---8<---
>> shaderVar :: (GLint -> a) -> GetShaderPName -> Shader -> GettableStateVar a
>> shaderVar f p shader = makeGettableStateVar $
>>    alloca $ \buf -> do
>>       -- repeatedly call glGetError here until it returns 0, to reset the
>>       -- GL error flags:
>>       -- https://www.opengl.org/sdk/docs/man/html/glGetError.xhtml
>>       glGetShaderiv (shaderID shader) (marshalGetShaderPName p) buf
>>       -- call glGetError here; if it isn't 0, throw an exception to avoid
>>       -- reading uninitialized memory
>>       peek1 f buf
>> ---8<---
>>
>> http://hackage.haskell.org/package/OpenGL-2.10.0.0/docs/src/Graphics-Renderi...
>>
>> glGetShaderiv doesn't modify the contents of buf on error. This means uninitialized memory is read by peek1 and then presumably used by f, which can cause a crash (in the best case) or wrong/undefined behaviour (in the worst case). https://www.opengl.org/sdk/docs/man/html/glGetShader.xhtml
> Again, you had better make sure at the application level that the "Shader" you pass into the shader queries is actually a shader name. Otherwise you get nonsensical values for the shader state you queried, or even a Haskell "error" in the case of "shaderType".
>
> Just out of curiosity: what was your expectation regarding error handling/detection? Using exceptions would be horrible, because as an application programmer you would have a very hard time dealing with them without causing resource leaks and/or inconsistent program state. Another option would be wrapping almost everything in the OpenGL binding in Maybe/Either, but I don't think anybody would welcome that.
I think throwing catchable IO exceptions on the very rare (exceptional!) occasions where the alternative is undefined behaviour (including crashes) would be much nicer than wrapping all the results.

Claude
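For illustration, a minimal sketch of what such a check could look like inside shaderVar. It reuses the identifiers from the snippet quoted above and assumes the raw binding layer also exposes glGetError :: IO GLenum (mirroring the C entry point linked earlier); it is not the binding's actual code:

---8<---
shaderVar :: (GLint -> a) -> GetShaderPName -> Shader -> GettableStateVar a
shaderVar f p shader = makeGettableStateVar $
   alloca $ \buf -> do
      clearErrorFlags
      glGetShaderiv (shaderID shader) (marshalGetShaderPName p) buf
      err <- glGetError
      if err /= 0  -- GL_NO_ERROR is 0
         then ioError (userError ("glGetShaderiv failed with GL error " ++ show err))
         else peek1 f buf  -- buf is read only when the query succeeded
   where
      -- glGetError is assumed to come from the same raw binding as glGetShaderiv.
      -- Drain the error queue so the glGetError above reflects only this call.
      clearErrorFlags = do
         err <- glGetError
         if err == 0 then return () else clearErrorFlags
---8<---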