
> > I am currently evaluating different languages for implementing an
> > application which will have to manipulate large graphs representing
> > the structure of programs and their evolution.
> >
> > Speed is in fact a crucial criterion for the language choice.

A wise man once warned about the danger of premature optimisation. I often spend ages labouring over efficiency aspects of my code (GHC, for example) that turn out to be nowhere near the critical path. Language choice is another example.

My biased impression is that raw speed is seldom the determining factor in language choice. Language shootouts are fun and motivating, and give results that have the respectability that numbers confer. Other things being equal, one would always choose the fastest vehicle; but other things are *never* equal! On the contrary, other factors are much more important than speed: how quickly the code can be written, how much debugging is required once it is written, what libraries are available, what the programming environment is like, how easy it is to modify the program without giving rise to unforeseen bugs, etc etc.

Sometimes performance is indeed central, but seldom. Usually the program is either 10x faster than necessary or 10x slower than necessary. In neither case does a 2x factor in language make much difference: in the former case you're happy either way; in the latter you need to fix your algorithm either way.

For GHC in particular, there are many places where performance is less good than it could be, simply due to lack of time from Simon and me to tune the compiler. For example, there's been some correspondence recently about array-processing loops that's still on my to-do list. We're always looking for people to help improve GHC's performance, where the problem is simply lack of effort. Speak up, guys!

That said, there are undoubtedly reasons why a high-level language is fundamentally never going to be as fast as a low-level one. (GHC beats C on nfib, but that's about all.) But I bet that this difference is only crucial in a small minority of applications.

Simon
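[The nfib function Simon mentions is the classic call-counting micro-benchmark; a minimal version, as an editorial sketch rather than code from the thread, looks like this:]

```haskell
-- nfib n returns the number of calls made to compute it, so the
-- result doubles as a measure of work done ("nfibs per second").
nfib :: Int -> Int
nfib n | n < 2     = 1
       | otherwise = nfib (n - 1) + nfib (n - 2) + 1

main :: IO ()
main = print (nfib 30)
```

Because every call contributes exactly 1 to the result, dividing the result by the elapsed time gives a crude calls-per-second figure, which is why the benchmark survived so long despite measuring little besides function-call overhead.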

"Simon Peyton-Jones" wrote:

> > I am currently evaluating different languages for implementing an
> > application which will have to manipulate large graphs representing
> > the structure of programs and their evolution.
> >
> > Speed is in fact a crucial criterion for the language choice.
>
> My biased impression is that raw speed is seldom the determining
> factor in language choice.

Indeed. And I would add that, if the application's algorithms turn out to be sufficiently lazy, then using Haskell /could/ end up being /more/ space-efficient than the alternative languages, which would be a big win if the data structures being manipulated are very large.

Regards,
Malcolm
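[A small illustration of Malcolm's point, as a hypothetical editorial sketch rather than code from the thread: in a lazy pipeline, a large list is produced one cons cell at a time and consumed immediately, so the traversal runs in constant space even though the data structure is nominally huge.]

```haskell
-- The list [1 .. n] is never materialised in full: filter and length
-- demand one element at a time, so the whole computation needs only
-- constant space regardless of n.
countEvens :: Int -> Int
countEvens n = length (filter even [1 .. n])

main :: IO ()
main = print (countEvens (10 ^ 7))
```

A strict language (or a strict Haskell encoding of the same pipeline) would either allocate the intermediate lists in full or force the programmer to fuse the loop by hand.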

Simon Peyton-Jones wrote:
> A wise man once warned about the danger of premature optimisation. I
> often spend ages labouring over efficiency aspects of my code (GHC,
> for example) that turn out to be nowhere near the critical path.
> Language choice is another example.
>
> My biased impression is that raw speed is seldom the determining
> factor in language choice.
>
> Etc. ...
>
> That said, there are undoubtedly reasons why a high-level language is
> fundamentally never going to be as fast as a low-level one.
Well... what does "a fast language" mean?

My favourite example, some centuries old, goes back to the times when I was a happy young physicist. We wanted to implement a combinatorial problem: the global effect of individual nucleon-nucleon scattering in alpha-alpha collisions. Some 256 diagrams to compute, many more for the final result.

A friend of mine wrote a Pascal/Fortran program for a CDC Cyber mainframe, and I took microProlog and a Sinclair Spectrum with 48K of main storage. His program gave the result after 3 seconds. Mine: after, hm... well, after some hours I had to use a cassette player to store the intermediate results; finally, the next morning, I got something usable.

So what? - you will ask. Well, the crux of the matter is that I wrote my program in two days. My friend spent about two weeks to complete his coding. Now, who could drink more bottles of beer before obtaining the final results?

===================

And now, again, more and more people curse laziness, fight against boxed, shareable data, want to produce lightspeed database interfaces, etc. I am too lazy to get nervous (unless I do so for theatrical reasons, as an old teacher), but I sincerely think it is time somebody wrote a book on Haskell as a language for the FAST DESIGN of lousy algorithms...

Jerzy Karczmarczuk
Caen, France

On Thu, 18 Mar 2004, Simon Peyton-Jones wrote:
> Date: Thu, 18 Mar 2004 10:28:54 -0000
> From: Simon Peyton-Jones
> To: Ketil Malde, Sébastien Pierre
> Cc: glasgow-haskell-users@haskell.org
> Subject: RE: Haskell performance
>
> > I am currently evaluating different languages for implementing an
> > application which will have to manipulate large graphs representing
> > the structure of programs and their evolution.
> >
> > Speed is in fact a crucial criterion for the language choice.
Not long ago, I threw out 15,000 lines of optimized C in favour of Haskell, precisely because of speed: implementing the algorithm I wanted would have been more or less impossible in C because of the overwhelming complexity. In C I would shoot my toes off just by looking at any memory use; in Haskell I can concentrate on coding this much faster algorithm.

If you want a 10x speed-up, you wait some time for hardware technology to catch up. If you want a 100x speed-up, you reconsider your algorithms.

----------------------------------------------------------
Johan 'Mahogny' 'Staalis' Henriksson, mahogny@areta.org
Student at Chalmers University of Technology and Gothenburg University
Lead Coder at Areta, www.areta.org
Engineer and sysadmin at BRK-Tjaenst AB, www.brksweden.se
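[Johan's 100x-from-algorithms point in one small example, as a hypothetical editorial sketch rather than code from the thread: replacing the quadratic `Data.List.nub` with a Set-backed deduplication drops the complexity from O(n^2) to O(n log n) while returning the same first-occurrence-ordered answer. This uses `Data.Set` from the containers package, which ships with GHC.]

```haskell
import qualified Data.Set as Set

-- Order-preserving deduplication in O(n log n): carry a Set of the
-- elements seen so far instead of rescanning the output list, as the
-- O(n^2) Data.List.nub does.
nubOrd :: Ord a => [a] -> [a]
nubOrd = go Set.empty
  where
    go _    []     = []
    go seen (x:xs)
      | x `Set.member` seen = go seen xs
      | otherwise           = x : go (Set.insert x seen) xs

main :: IO ()
main = print (nubOrd "mississippi")
```

On a list of a few million elements the constant-factor tuning of a quadratic loop is irrelevant next to this kind of asymptotic change, which is the sense in which a 100x speed-up comes from the algorithm, not the language.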
participants (4)
-
Jerzy Karczmarczuk
-
mahogny@sirius.areta.org
-
Malcolm Wallace
-
Simon Peyton-Jones