RE: ANNOUNCE: GHC survey results

On 28 June 2005 14:11, Bulat Ziganshin wrote:
Tuesday, June 28, 2005, 1:58:13 PM, you wrote:
http://www.haskell.org/ghc/survey2005-summary.html There's a lot to take in, but it's an interesting read. Enjoy!
Thank you all; processing all 600 answers was not easy work :)
I have several comments regarding the results of this survey:
1) GHCi compiles to bytecode several times faster than GHC performs an unoptimized compile. Can unoptimized GHC compilation just produce bytecode (as OCaml does)?
Do you really mean "several times faster"? My impression is that it's a few percent faster, if at all. Some measurements would be good. We've thought about compiling to persistent byte code in the past, but the gains never seemed to be worth the effort: we'd need a byte-code serialiser and loader.
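Roughly, a toy sketch of what such a serialiser and loader would involve (the instruction type is made up; this is not GHC's real byte code or the Binary.hs interface):

    -- Hypothetical byte-code instructions, flattened to plain Ints on
    -- the way out and parsed back on the way in.
    data Instr = PushInt Int | Add | Call Int
      deriving (Show, Eq)

    serialise :: [Instr] -> [Int]
    serialise = concatMap enc
      where
        enc (PushInt n) = [0, n]
        enc Add         = [1]
        enc (Call n)    = [2, n]

    load :: [Int] -> [Instr]
    load []           = []
    load (0 : n : is) = PushInt n : load is
    load (1 : is)     = Add       : load is
    load (2 : n : is) = Call n    : load is
    load _            = error "corrupt byte-code stream"

Here (load . serialise) is the identity on instruction streams. Most of the real effort would presumably be in serialising the linkage information needed to resolve names at load time, not in the instruction encoding itself.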
2) Do your plans to support the x86-64 platform include Windows (and the various flavours of Unix), or is it for Linux only?
Definitely Windows too. I hear the cygwin/mingw tools are still unstable on Windows x64, though.
3) Many users complain about incompatibility between GHC versions. If they mean library interface changes, then how about using Pesco's library versioning scheme? (see http://www.haskell.org/tmrwiki/EternalCompatibilityInTheory)
I've read that article, and I think it's an interesting idea. I can't disagree with the arguments put forward, but somehow, the cure seems a bit painful.

Cheers,
Simon

On 29 Jun 2005, at 11:03, Simon Marlow wrote:
On 28 June 2005 14:11, Bulat Ziganshin wrote:
3) Many users complain about incompatibility between GHC versions. If they mean library interface changes, then how about using Pesco's library versioning scheme? (see http://www.haskell.org/tmrwiki/EternalCompatibilityInTheory)
I've read that article, and I think it's an interesting idea. I can't disagree with the arguments put forward, but somehow, the cure seems a bit painful.
Maybe a slightly painful cure would be preferable to a protracted disease? ;) Metaphors aside, I have specifically tried to minimize developer pain with ECT. Can you be specific as to where you fear it? In adoption? In maintenance?

In the survey summary, you state that you "will strive to clearly indicate which features and libraries are considered experimental". While that means users of GHC can better avoid breakage, it obviously does so at the cost of prohibiting their use of those often very appealing new developments. Also, extra stress is put on the library developers to "get everything right" when a library is moved to stable status and, at the same time, to change as little as possible afterwards. With a growing number of users (as currently seems to be happening), pressure to not change anything will likewise rise. The effect can already be seen in the Haskell 98 libraries, for some of which considerable improvements have become available; still we cannot change them. H98 is not so bad, because the set of libraries is relatively small and contained, but as can be seen in GHC 6.4, our library base is growing fast. People will want to use these libraries, but they /will/ need to change in the future, lest evolution stagnate.

But I should stop preaching to the choir. As the survey shows, backwards compatibility is very important /and/ users praise rapid evolution. Thus we should work to resolve the apparent dichotomy. ECT can do it, but, as you state, the question is just: is it too painful? I think not, because:

1. Initial adoption consists of
   - renaming modules - easy to automate,
   - changing imports - also possible to automate, and
   - creating the "short cut" modules - very easy to automate.
   The semester ends in two weeks and I'd be happy to contribute. If there is interest, I will get to work.

2. Proper maintenance means paying attention to notice incompatible interface changes. Surely this is currently no different. Then, upon an interface change:
   - renaming the module - trivial, and
   - retaining the old version in some form (see the sketch below), either by
     o keeping a copy of the old code in place - trivial but bloaty,
     o implementing a compatibility adaptor, or
     o if the change was actually backwards-compatible (as will undoubtedly happen as well), re-exporting the new version - a convenience-script job.

What am I missing? Anyway, thanks for reading my article!

With best regards,
Pesco alias Sven Moritz
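P.S. For concreteness, here is roughly what the layout from point 2 might look like. The module names and version numbers are invented for illustration, and each module would live in its own file:

    -- Frozen version 1.0 of some library interface:
    module Text.Foo_1_0 (greet) where
    greet :: String -> String
    greet name = "Hello, " ++ name

    -- Version 1.1 changes the interface incompatibly, so it gets a
    -- new module name; Text.Foo_1_0 stays in place for old clients:
    module Text.Foo_1_1 (greet) where
    greet :: String -> String -> String
    greet greeting name = greeting ++ ", " ++ name

    -- Alternatively, Text.Foo_1_0 can be rewritten as a thin
    -- compatibility adaptor over the new version, instead of
    -- keeping a full copy of the old code:
    module Text.Foo_1_0 (greet) where
    import qualified Text.Foo_1_1 as New
    greet :: String -> String
    greet = New.greet "Hello"

    -- The "short cut" module always re-exports the newest version,
    -- for clients happy to chase the latest interface:
    module Text.Foo (module Text.Foo_1_1) where
    import Text.Foo_1_1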

On Wed, 29 Jun 2005, Sven Moritz Hallberg wrote:
On 29 Jun 2005, at 11:03, Simon Marlow wrote:
On 28 June 2005 14:11, Bulat Ziganshin wrote:
[...] how about using Pesco's library versioning scheme? (see http://www.haskell.org/tmrwiki/EternalCompatibilityInTheory)
I've read that article, and I think it's an interesting idea. I can't disagree with the arguments put forward, but somehow, the cure seems a bit painful.
[...]
What am I missing?
I agree that the benefits of your scheme are desirable. However, in some sense the method amounts to manual version control. We have nice tools that take care of version control, and I wouldn't want to do that by hand.

As a simple (but perhaps contrived) example: if a project consists of 50 modules, then after on average 10 changes per module the project will consist of at least 500 modules. If a feature from the first version of some module turned out to be buggy, and all versions needed to be fixed, then ~10 modules might need to be split into two (as per your A_3_9 and A_4_1 example). With a revision control system it should suffice to patch one module, and then users could choose whether to include that patch or not. (Of course, a basic revision control system has other drawbacks compared to your scheme.)

-- /NAD

Hello Simon,
Wednesday, June 29, 2005, 1:03:06 PM, you wrote:
1) GHCi compiles to bytecode several times faster than GHC performs an unoptimized compile. Can unoptimized GHC compilation just produce bytecode (as OCaml does)?
SM> Do you really mean "several times faster"? My impression is that it's a
SM> few percent faster, if at all. Some measurements would be good.

Duron 1 GHz, 4000 source lines in 14 modules:
  ghc compile - 18 secs
  ghc linking - 2 secs
  ghci load   - 8 secs
  runhugs     - 0.9 secs

So with bytecode I would get compilation roughly 2.5 times faster (8 secs vs. 20 secs for compile plus link), but that is still about 10 times slower than with Hugs (0.9 secs). I have seen plans for integrating the Hugs and GHC run-time environments, but as far as I can see they were never implemented. By the way, have you profiled GHC? Which phases of compilation take the most time?

SM> We've thought about compiling to persistent byte code in the past, but the
SM> gains never seemed to be worth the effort: we'd need a byte-code
SM> serialiser and loader.

You already have Binary.hs as a base; maybe creating a serialiser would not be so much work?
3) Many users complain about incompatibility between GHC versions. If they mean library interface changes, then how about using Pesco's library versioning scheme? (see http://www.haskell.org/tmrwiki/EternalCompatibilityInTheory)
SM> I've read that article, and I think it's an interesting idea. I can't
SM> disagree with the arguments put forward, but somehow, the cure seems a
SM> bit painful.

At least from the user's point of view, it looks beautiful. From the viewpoint of a library creator/maintainer it can create some problems. But I think it is not necessary to create old interfaces to new libs for every library; keeping the old library version will be enough. So, for example, we would have a libraries directory with the newest versions of the bundled libs and a libversions directory, which would contain Data.Map.6.2.hs and so on.

I am still waiting for answers to my "real world" questions :)

By the way, in the Pugs sources (http://search.cpan.org/CPAN/authors/id/A/AU/AUTRIJUS/Perl6-Pugs-6.2.7.tar.gz) there is a Unicode.hs module which can classify and convert the full range of Unicode symbols under any OS. I don't understand: is this module already included in GHC 6.5? If not, it would be a good addition (see the example below).

--
Best regards,
Bulat                          mailto:bulatz@HotPOP.com
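P.S. A toy example of the kind of classification such a module provides, written against Data.Char as it stands in current base libraries; whether GHC 6.5's Data.Char already covered the full Unicode range is exactly the question above:

    import Data.Char (generalCategory, isAlpha, toUpper)

    -- Report what a full-Unicode Data.Char should know about a character:
    -- its general category, alphabetic status, and upper-case mapping.
    describe :: Char -> String
    describe c = show c ++ ": category " ++ show (generalCategory c)
              ++ ", alpha " ++ show (isAlpha c)
              ++ ", upper " ++ show (toUpper c)

    main :: IO ()
    main = mapM_ (putStrLn . describe) ['a', '9', 'ß', 'я', '‰']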