Performance problem with Haskell/OpenGL/GLFW

Hi there! My first post here. :) I have a curious performance problem with a Haskell/OpenGL/GLFW program that I just can't figure out.

The program's CPU usage is quite high in general, and increases with the number of pixels redrawn per frame. It uses OpenGL 3.2, does most of its work on the GPU, and should do a constant amount of work per frame on the CPU, so this makes little sense.

I've created a smaller program (still 91 lines, though) which renders a single triangle, but still reproduces this problem. If I make the triangle smaller (by editing vertexData), CPU usage goes down, and if I make it bigger, CPU usage goes up. [1]

I've also created a C program which does the same thing (as far as possible), but which does not exhibit this problem. Its CPU usage is far lower, and stays constant regardless of the number of pixels redrawn per frame. [2]

The library functions I use (from OpenGLRaw-1.3.0.0 and GLFW-b-0.1.0.5) are only thin wrappers around their C counterparts, so I strongly believe the problem is in my program. Any thoughts?

-- Jesper Särnesjö
http://jesper.sarnesjo.org/

[1] https://gist.github.com/sarnesjo/5116084#file-test-hs
[2] https://gist.github.com/sarnesjo/5116084#file-test-c

I've figured out what the problem is: the Haskell program is using a software implementation of OpenGL, so rendering *does* in fact happen on the CPU. This seems to happen because I request a renderer conforming to the OpenGL 3.2 Core profile, while running on Mac OS X 10.8.2 [3]. If I remove those lines from displayOptions, the program receives a hardware-accelerated renderer. Of course, then I can't use features from OpenGL 3.2, so that's hardly a solution.

It would appear that there is in fact some relevant difference between the Haskell [1] and C [2] versions of my program, or possibly some way in which the bindings from the GLFW-b package are different from the C library. I've updated my programs to check for hardware acceleration and also print the GLFW and OpenGL versions. [4]

-- Jesper Särnesjö
http://jesper.sarnesjo.org/

[1] https://gist.github.com/sarnesjo/5116084#file-test-hs
[2] https://gist.github.com/sarnesjo/5116084#file-test-c
[3] http://developer.apple.com/library/mac/#documentation/GraphicsImaging/Concep...
[4] https://gist.github.com/sarnesjo/5116084/revisions

To remove any possibility of the problem being in GLFW-b or OpenGLRaw, I created two new programs, one in Haskell [1] and one in C [2], that don't import or include anything related to OpenGL, and that simply create a context, check whether it is hardware accelerated, and then exit. That is all. And still, the Haskell program receives a software renderer, while the C program receives a hardware one:

$ ghc -O2 Test2.hs -lglfw -framework OpenGL -fforce-recomp && ./Test2
[1 of 1] Compiling Main ( Test2.hs, Test2.o )
Linking Test2 ...
software
(2,7,7)
(3,2,0)
$ gcc -O2 test2.c -lglfw -framework OpenGL && ./a.out
hardware
2.7.7
3.2.0

I haven't had the chance to run these programs on any OS other than Mac OS X 10.8.2, so I don't know if this problem is Mac-specific. Still, it's really weird that the system would differentiate between Haskell and C programs in this way.

If anyone has any ideas about what's going on here, I'd very much like to hear them.

-- Jesper Särnesjö
http://jesper.sarnesjo.org/

[1] https://gist.github.com/sarnesjo/5116084#file-test2-hs
[2] https://gist.github.com/sarnesjo/5116084#file-test2-c

I'm building glfw now on 10.7.5 and I'll try your test code. I've been learning Haskell (still very much a beginner) but I know OpenGL, so I'm very interested in how this turns out.

-Hollister
_______________________________________________
Beginners mailing list
Beginners@haskell.org
http://www.haskell.org/mailman/listinfo/beginners

OK, I get the same results as you. I ran dtruss on the two apps to look at the system calls being made, and I can see where the C code opens the OpenGL hardware driver and the Haskell code does not, but I'm not sure why. There are a lot of preferences files flying around. Still digging.

I do know that both apps are using the same glfw library.

Hi.
AFAIK GLFW-b builds its own bundled copy of GLFW during setup.
There is a Makefile inside the package.
Can't reproduce this error on Arch.

I don't think so - I had to build and install glfw from source to get the Haskell code to link after building glfw-b (which built fine without GLFW installed, incidentally).

I get the following for library dependencies - it looks like they're using the exact same libs (aside from iconv):

hhmacbook:~/Development/haskell/OpenGL:6> otool -L ./a.out
./a.out:
        @executable_path/libglfw.dylib (compatibility version 1.0.0, current version 1.0.0)
        /System/Library/Frameworks/OpenGL.framework/Versions/A/OpenGL (compatibility version 1.0.0, current version 1.0.0)
        /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 159.1.0)
hhmacbook:~/Development/haskell/OpenGL:7> otool -L ./Test2
./Test2:
        @executable_path/libglfw.dylib (compatibility version 1.0.0, current version 1.0.0)
        /System/Library/Frameworks/OpenGL.framework/Versions/A/OpenGL (compatibility version 1.0.0, current version 1.0.0)
        /usr/lib/libiconv.2.dylib (compatibility version 7.0.0, current version 7.0.0)
        /usr/lib/libSystem.B.dylib (compatibility version 1.0.0, current version 159.1.0)
hhmacbook:~/Development/haskell/OpenGL:8>
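For anyone repeating this dependency check on another OS: `otool -L` is macOS-specific; on Linux the same inspection is done with `ldd`. A minimal sketch (shown against `/bin/ls` and libc purely so it runs anywhere; point it at your own `./Test2` and `./a.out` binaries and grep for glfw, GL, or X11 instead):

```shell
# List the shared libraries a binary links against and filter for a
# specific one. Here /bin/ls and libc stand in for the real targets;
# the macOS equivalent would be: otool -L ./Test2 | grep -i glfw
ldd /bin/ls | grep -i 'libc'
```

If the Haskell and C binaries show the same dependency lists, as they do above apart from libiconv, the difference has to come from runtime behavior rather than linkage.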

GLFW-b does indeed bundle its own version of the GLFW C library [1], version 2.7.3 as of right now. This is why it can be installed without first installing the C library on your system. However, this is not the library that Test2.hs (nor test2.c) links against. In fact, that program does not use GLFW-b at all, but rather simply declares a few entry points with C calling convention that should be there at linking time. I did this only to be able to rule out any problem with GLFW-b.
I got the chance to run my code on a Windows 7 machine, and didn't see the problem there either. This seems to be specific to Haskell on Mac OS X.

-- Jesper Särnesjö
http://jesper.sarnesjo.org/

[1] https://github.com/bsl/GLFW-b/tree/master/glfw

Check whether the Haskell OpenGL binding is using the X11 libraries instead of the native GUI. My (possibly incorrect) understanding is that XQuartz does not yet support OpenGL 3.x directly, so it falls back to software rendering.

--
brandon s allbery kf8nh
sine nomine associates
allbery.b@gmail.com
ballbery@sinenomine.net
unix, openafs, kerberos, infrastructure, xmonad
http://sinenomine.net

Digging... Is that a configurable thing?

I just checked and it looks like it's at least trying to use the native framework.

--
brandon s allbery kf8nh
sine nomine associates
allbery.b@gmail.com
ballbery@sinenomine.net
unix, openafs, kerberos, infrastructure, xmonad
http://sinenomine.net

That's what I thought from looking at the output from dtruss.

Is this a GLFW-only issue? It seems unlikely that Haskell OpenGL apps in general have been using the software renderer.

Okay, I think I just figured this out. Well, HOW to get it working with the accelerated renderer, anyway.

I was wondering about a simple way to check renderer info, so I ran glxinfo. This (automatically) fired up X11, and then on a hunch I re-ran Test2 with X11 running and got this:

hhmacbook:~/Development/haskell/OpenGL:57> ./Test2
hardware
(2,7,7)
(3,2,0)
hhmacbook:~/Development/haskell/OpenGL:58>

AH HA! I then quit X11 and re-ran Test2, and got this:

hhmacbook:~/Development/haskell/OpenGL:58> ./Test2
software
(2,7,7)
(3,2,0)
hhmacbook:~/Development/haskell/OpenGL:59>

SO - if you want the accelerated renderer, you need to have X11 running. Now, I have no idea WHY this is the case, but there you go.

Hope this helps.

-Hollister

This led me down an interesting path.

First, I should explain that my machine, like most newish Macs, has two graphics cards: in my case, a discrete Nvidia GeForce GT 330M and an integrated Intel chip. The former is better, but the latter uses less power, and the system is supposed to switch between them automatically.

I used gfxCardStatus [1] to show which card was in use. When I ran test2.c, the system briefly switched to the discrete card. However, when I ran Test2.hs, the system kept using the integrated chip the whole time. Presumably, the Intel chip lacks a hardware implementation of OpenGL 3.2, which causes the system to fall back to a software renderer. I then used gfxCardStatus to force the system to *always* use the discrete card and - boom! - this time Test2.hs received a hardware renderer!

So it seems that the problem is a) Mac OS X-specific, or possibly specific to systems with multiple graphics cards, and b) related to triggering the *switch* to the better graphics card. I don't yet understand why the C program triggers a switch while the Haskell program does not, but I'll keep investigating.

Thank you all very much for your help!

-- Jesper Särnesjö
http://jesper.sarnesjo.org/

[1] http://gfx.io

I guess running X11 forces use of the NVidia chip. Interesting.
-Hollister

Yes; you can see some discussion about it on Apple's X11-Users list, if you care.

--
brandon s allbery kf8nh
sine nomine associates
allbery.b@gmail.com
ballbery@sinenomine.net
unix, openafs, kerberos, infrastructure, xmonad
http://sinenomine.net

I haven't had much time to look at this, unfortunately, but I did notice one interesting thing: the Haskell program *does* in fact trigger a switch of graphics cards, just... not as quickly.

To see this, you can check the system console (using Console.app). Here is what a switch from the integrated card to the discrete one looks like on my machine:

3/13/13 12:02:22.486 AM WindowServer[77]: Received display connect changed for display 0x4272dc0
3/13/13 12:02:22.548 AM WindowServer[77]: Received display connect changed for display 0x3f003d
3/13/13 12:02:22.549 AM WindowServer[77]: CGXMuxAcknowledge: Posting glitchless acknowledge
3/13/13 12:02:22.593 AM WindowServer[77]: Received display connect changed for display 0x4272dc0

When I run the C program, this gets logged immediately following the execution of glfwOpenWindow (I stepped through the program using GDB). For the Haskell program, well... If I run the program normally, the above gets logged with roughly a second's delay, and the program receives a software renderer. However, if I step through it using GHCi, it gets logged immediately following the execution of glfwOpenWindow - and the program receives a hardware renderer!

Shot in the dark here, but could this be due to lazy I/O? I seem to recall reading something about GHCi forcing stricter I/O.

-- Jesper Särnesjö
http://jesper.sarnesjo.org/

Now that I have a better idea of what the underlying problem is, I am going to rephrase and repost this on haskell-cafe. Hollister, Andrey and Brandon, I really appreciate you helping me narrow it down.

-- Jesper Särnesjö
http://jesper.sarnesjo.org/

I'm very interested to hear how it turns out - please let us know if you learn anything new.

-Hollister
participants (4)
- Andrey Yankin
- Brandon Allbery
- Hollister Herhold
- Jesper Särnesjö