
On Mon, Mar 11, 2013 at 7:17 AM, Hollister Herhold wrote:
Okay, I think I just figured this out. Well, HOW to get it working with the accelerated renderer.
I wanted a simple way to check renderer info, so I ran glxinfo. This (automatically) fired up X11, and then, on a hunch, I re-ran Test2 with X11 running and got this:
hhmacbook:~/Development/haskell/OpenGL:57> ./Test2
hardware (2,7,7) (3,2,0)
hhmacbook:~/Development/haskell/OpenGL:58>
AH HA! I then quit X11 and re-ran Test2, and got this:
hhmacbook:~/Development/haskell/OpenGL:58> ./Test2
software (2,7,7) (3,2,0)
hhmacbook:~/Development/haskell/OpenGL:59>
SO- If you want the accelerated renderer, you need to have X11 running.
Now, I have no idea WHY this is the case, but there you go.
Hope this helps.
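
For reference, here is a minimal sketch of a renderer check along these lines, assuming the GLFW-b and OpenGL packages (this is only an illustration, not the original Test2.hs, and its output format differs): it requests a 3.2 core profile context and then reports the GL_RENDERER and GL_VERSION strings, printing "software" when the renderer string looks like Apple's software fallback.

import Control.Monad (forM_)
import Data.List (isInfixOf)
import qualified Graphics.Rendering.OpenGL as GL
import qualified Graphics.UI.GLFW as GLFW

main :: IO ()
main = do
  ok <- GLFW.init
  if not ok
    then putStrLn "GLFW init failed"
    else do
      -- Mac OS X only hands out a 3.2 context for a forward-compatible
      -- core profile, so ask for exactly that.
      forM_ [ GLFW.WindowHint'ContextVersionMajor 3
            , GLFW.WindowHint'ContextVersionMinor 2
            , GLFW.WindowHint'OpenGLProfile GLFW.OpenGLProfile'Core
            , GLFW.WindowHint'OpenGLForwardCompat True
            ] GLFW.windowHint
      mwin <- GLFW.createWindow 320 240 "renderer-check" Nothing Nothing
      case mwin of
        Nothing  -> putStrLn "could not create an OpenGL 3.2 window"
        Just win -> do
          GLFW.makeContextCurrent (Just win)
          rend <- GL.get GL.renderer   -- e.g. "NVIDIA GeForce GT 330M OpenGL Engine"
          vers <- GL.get GL.glVersion  -- e.g. "3.2 NVIDIA-..."
          -- Apple's fallback renderer identifies itself as
          -- "Apple Software Renderer".
          putStrLn (if "Software" `isInfixOf` rend then "software" else "hardware")
          putStrLn (rend ++ " / " ++ vers)
          GLFW.destroyWindow win
      GLFW.terminate

With the discrete card active, the renderer string should name the GPU; with the software fallback it should mention the software renderer instead.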
This led me down an interesting path. First, I should explain that my machine, like most newish Macs, has two graphics cards: in my case, a discrete Nvidia GeForce GT 330M and an integrated Intel chip. The former is more powerful, but the latter uses less power, and the system is supposed to switch between them automatically.

I used gfxCardStatus [1] to show which card was in use. When I ran test2.c, the system briefly switched to the discrete card. However, when I ran Test2.hs, the system kept using the integrated chip the whole time. Presumably, the Intel chip lacks a hardware implementation of OpenGL 3.2, which causes the system to fall back to a software renderer. I then used gfxCardStatus to force the system to *always* use the discrete card and - boom! - this time Test2.hs received a hardware renderer!

So it seems that the problem is a) Mac OS X-specific, or possibly specific to systems with multiple graphics cards, and b) related to triggering the *switch* to the better graphics card. I don't yet understand why the C program triggers a switch while the Haskell program does not, but I'll keep investigating.

Thank you all very much for your help!

--
Jesper Särnesjö
http://jesper.sarnesjo.org/

[1] http://gfx.io