
Hello everyone,

I'm taking my first steps with OpenGL at the moment. It's really nice to use, but I've hit one issue that really puzzles me.

Let's say I want to draw a square grid across the whole screen as the background of a very basic 2D game. The application is meant to run in full-screen mode. Since I want it to work with different resolutions and aspect ratios, I introduced an IORef relSize (of type (GLfloat, GLfloat)) to store the size of the squares relative to the current window size, and tried to set a reshape callback of the form

    reshape relS s@(Size w h) = do
      viewport $= (Position 0 0, s)
      relS $= (size / w, size / h)

where size is just a name for the absolute square size defined elsewhere. The problem is that I see absolutely no way to convert w and h (which are of type GLsizei) into values of type GLfloat, so the second assignment does not typecheck. Is there a way to extract a Haskell Integer from a value of type GLsizei, and to convert a Haskell Float to a GLfloat? Or is there a more direct way to do what I'm trying to achieve? The only workaround I've come up with so far is using read . show as a converter, which can hardly be the intended way of doing this.

Thank you very much!

Sincerely,
Achim Krause
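
[Editor's note: the conversion being asked about can be sketched with fromIntegral. In the standard Haskell OpenGL bindings, GLsizei is an Integral type and GLfloat a Fractional one, so fromIntegral bridges them directly. The aliases, the squareSize value, and the relativeSize helper below are stand-ins for illustration, not names from the original message.]

```haskell
import Foreign.C.Types (CFloat, CInt)

-- Stand-in aliases: the real OpenGL bindings define GLsizei as an
-- Integral type (CInt) and GLfloat as a Fractional one (CFloat).
type GLsizei = CInt
type GLfloat = CFloat

-- Hypothetical absolute square size in pixels.
squareSize :: GLfloat
squareSize = 32

-- The computation from the reshape callback, with the conversion made
-- explicit: fromIntegral turns the GLsizei dimensions into GLfloats,
-- after which ordinary Fractional division applies.
relativeSize :: GLsizei -> GLsizei -> (GLfloat, GLfloat)
relativeSize w h = (squareSize / fromIntegral w, squareSize / fromIntegral h)

main :: IO ()
main = print (relativeSize 800 600)
```

Because fromIntegral works between any Integral and any Num instance, no read . show round-trip through strings is needed.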