How much of Haskell was possible 20 years ago?

Hi, I like Haskell, and use it as my main language. However, compiling a Haskell program usually takes a lot of memory and CPU. So I got curious, and would like to know from the computer scholars on this list: how much of Haskell would be possible on machines with really low CPU and memory? Which features would be feasible for a compiler to implement, and for programmers to use? Thanks, Maurício

Maurício writes:
... compiling a Haskell program usually takes a lot of memory and CPU. So I got curious, and would like to know from the computer scholars on this list: how much of Haskell would be possible on machines with really low CPU and memory? Which features would be feasible for a compiler to implement, and for programmers to use?
Check out Gofer by Mark Jones: http://web.cecs.pdx.edu/~mpj/goferarc/index.html Choose your preferred version from: http://citeseer.ist.psu.edu/jones94implementation.html Jerzy Karczmarczuk

On Oct 21, 2007, at 14:40, Maurício wrote:
I like Haskell, and use it as my main language. However, compiling a Haskell program usually takes a lot of memory and CPU. So I was
To some extent this is just a matter of Haskell not having been around that long ago: as ghc evolves, it's been getting better about this. Before ghc snapshots stopped being able to compile themselves (*grumble* --- is this fixed yet?) I found that ghc 6.7 compiling itself didn't do nearly as much violence to my desktop machine as compiling 6.7 (or 6.8pre or 6.9) with ghc 6.4 / 6.6 / 6.6.1.

-- 
brandon s. allbery [solaris,freebsd,perl,pugs,haskell]     allbery@kf8nh.com
system administrator [openafs,heimdal,too many hats]     allbery@ece.cmu.edu
electrical and computer engineering, carnegie mellon university         KF8NH

Brandon S. Allbery KF8NH escreveu:
On Oct 21, 2007, at 14:40, Maurício wrote:
I like Haskell, and use it as my main language. However, compiling a Haskell program usually takes a lot of memory and CPU. So I was
To some extent this is just a matter of Haskell not having been around that long ago: as ghc evolves, it's been getting better about this. Before ghc snapshots stopped being able to compile themselves (*grumble* --- is this fixed yet?) I found that ghc 6.7 compiling itself didn't do nearly as much violence to my desktop machine as compiling 6.7 (or 6.8pre or 6.9) with ghc 6.4 / 6.6 / 6.6.1.
Of course. But I'm thinking of something like an Intel 386 with 4MB of memory.

On Sun, 21 Oct 2007, Maurício wrote:
Of course. But I'm thinking of something like an Intel 386 with 4MB of memory.
According to "The History of Haskell" (http://www.haskell.org/haskellwiki/History_of_Haskell), (early versions of) Haskell could be used on such machines.

On Oct 21, 2007, at 15:21, Maurício wrote:
Of course. But I'm thinking of something like an Intel 386 with 4MB of memory.
It's kinda surprising to me how many people think that just because current/modern implementations of things use memory wastefully, this is somehow mandatory. When machines were smaller, programs used algorithms which were suited to those machines; the reason they don't now is to some extent laziness, but also because those algorithms often didn't scale to larger available memory (i.e. you get *big* speedups with more profligate algorithms when the memory they want is available).

-- 
brandon s. allbery [solaris,freebsd,perl,pugs,haskell]     allbery@kf8nh.com
system administrator [openafs,heimdal,too many hats]     allbery@ece.cmu.edu
electrical and computer engineering, carnegie mellon university         KF8NH

Mauricio writes:
... But I'm thinking of something like an Intel 386 with 4MB of memory.
It seems you decided to ignore my message. OK. I repeat it once more, and there will be no more: Gofer was a perfectly genuine Haskell, it ran in a 640K box, and it considerably influenced the type system of later Haskell. Jerzy Karczmarczuk

On Oct 21, 2007, at 15:31 , jerzy.karczmarczuk@info.unicaen.fr wrote:
Mauricio writes:
... But I'm thinking of something like an Intel 386 with 4MB of memory.
It seems you decided to ignore my message. OK.
Whoa there! Why assume malice? I got both his quoted response and your message at about the same time, which suggests to me he hadn't seen yours yet when he sent it.

-- 
brandon s. allbery [solaris,freebsd,perl,pugs,haskell]     allbery@kf8nh.com
system administrator [openafs,heimdal,too many hats]     allbery@ece.cmu.edu
electrical and computer engineering, carnegie mellon university         KF8NH

Brandon S. Allbery KF8NH writes:
jerzy.karczmarczuk@info.unicaen.fr wrote:
It seems you decided to ignore my message. OK.
Whoa there! Why assume malice? I got both his quoted response and your message at about the same time, ...
Gosh, I am not assuming any malice, I am too old for that... I could have thought, since it has happened before, that Maurício, or whoever, *dismisses* Gofer as something so old that it couldn't possibly be related to modern languages. Mind you, Mark Jones did it before 1994, when the paper reporting the implementation was written. Several guys here were not born yet. OK, don't shout, I know I exaggerate...

What I found somewhat funny, with all respect, is the combination: 20 years ago, which means '87, and a miserable 4M of memory. At that time 4M on a personal computer was not so common, at least in Europe.

Jerzy Karczmarczuk

It seems you decided to ignore my message. OK.
Whoa there! Why assume malice? I got both his quoted response and your message at about the same time (...)
(...) *dismisses* Gofer as something so old that it couldn't possibly be related to modern languages. Mind you, Mark Jones did it before 1994, (...) OK, don't shout, I know I exaggerate... What I found somewhat funny, with all respect, is the combination: 20 years ago, which means '87, and a miserable 4M of memory. At that time 4M on a personal computer was not so common, at least in Europe. Jerzy Karczmarczuk
Sorry, Jerzy. Brandon's message was just faster to answer; I'll need some time to check Gofer. I also wrote "PC-AT with 256KB" in my original message, but I changed it to a 386 since I didn't want you guys to feel under attack from reactionaries. People's feelings are easy to hurt, and it's difficult to please everyone :)

20 years ago, I wrote a brute-force attack on a magazine game, but by the time my TK-3000 Apple found an answer the due date had long passed and I could not get a prize from the magazine. In '92, when my family got a PC-AT, the same game was solved in 5 minutes, so to this day that PC is still my psychological reference for "all the power I need". I enjoyed a Prolog compiler on that system, and my intuition says Haskell could also fit there. And, at the same time, today's operating systems are happy to announce how easily they turn your dual core into a great video cassette :(

Anyway, what I would like is a "theoretical" answer. Is there something fundamentally different between a C compiler and a Haskell one that makes the former fit into 30Kb but not the latter? If so, which compiler algorithms are those, and what are their theoretical needs? I really don't need an easy answer. If you guys just suggest computer theory books, I will be happy to look for them; today I just wouldn't know what to read.

Best, Maurício

On Oct 21, 2007, at 21:31, Maurício wrote:
Anyway, what I would like is a "theoretical" answer. Is there something fundamentally different between a C compiler and a Haskell one that makes the former fit into 30Kb but not the latter? If
I am not sure *modern* C would have fit into 30KB. Back then, C was the original K&R C, which was in many ways much lower-level than ANSI C; those compilers could be quite compact. (I remember playing with SmallC, which compiled a subset of C that covered 95% of the features programmers actually used back then, and was extremely compact.)

There are two reasons ANSI C would be unlikely to fit: modern C compilers do optimization, which (aside from very simplistic "peephole" optimization) needs lots of memory; and the ANSI C type system (such as it is) is complex enough to require more memory to deal with properly. In early K&R C (which was still relatively close to its untyped predecessor BCPL), many things were "untyped" and effectively (int), a machine word; non-(int) integral types were promoted to (int), and (float) was promoted to (double), wherever possible, to simplify things for the compiler. You could use more complex types to some extent, but they tended to perform poorly enough that much effort went into avoiding them. Even in later K&R C (which still did type promotion but deprecated "untyped" functions and removed "untyped" variables), people avoided using even (long int) except through pointers, because they were so expensive to use as function arguments and return values (and some compilers, mostly on some mainframe architectures, *couldn't* return (long int) safely).

Both of these also apply to Haskell. In particular, you can do a naive compilation of Haskell code, but it will perform very poorly --- Haskell *needs* a good optimizer. (Compare the performance of code compiled with unregisterised GHC to that compiled with normal, highly optimized GHC sometime.) Additionally, even Haskell 98 can need quite a bit of memory to do type unification if you don't explicitly specify every type everywhere; and this gets much worse with the extensions in modern Haskell (in particular, functional dependencies complicate type unification).

There may be theory issues which also impact the question, but in large part it's not theory so much as practical concerns.

-- 
brandon s. allbery [solaris,freebsd,perl,pugs,haskell]     allbery@kf8nh.com
system administrator [openafs,heimdal,too many hats]     allbery@ece.cmu.edu
electrical and computer engineering, carnegie mellon university         KF8NH
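A minimal sketch of the type-unification blow-up Brandon mentions; this is an editorial illustration rather than code from the thread. The names dup and blowup are made up, and any Haskell 98 compiler (e.g. GHC or Hugs) should accept it:

    module Main where

    -- dup :: a -> (a, a); each application doubles the size of the type.
    dup :: a -> (a, a)
    dup x = (x, x)

    -- Left without a signature, blowup forces the compiler to infer and
    -- unify a result type containing 2^4 = 16 copies of the argument's
    -- type; every extra dup doubles it again, so inference alone can
    -- need space exponential in the size of the program text.
    blowup x = dup (dup (dup (dup x)))

    main :: IO ()
    main = print (blowup True)

The program only prints deeply nested pairs of Booleans; the point is that the inferred type of blowup doubles with each added dup, which is one concrete way type checking without annotations can become memory-hungry.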

On Sun, Oct 21, 2007 at 10:02:25PM -0400, Brandon S. Allbery KF8NH wrote:
On Oct 21, 2007, at 21:31, Maurício wrote:
Anyway, what I would like is a "theoretical" answer. Is there something fundamentally different between a C compiler and a Haskell one that makes the former fit into 30Kb but not the latter? If
Another large part of the issue is that C compilers tend to be written in C, while Haskell compilers tend to be written in Haskell. C, as a lower-level language, strongly encourages programmers to think about issues such as space and time usage, at the cost of much more development time. In terms of simple complexity, the difference isn't all that big: the original NHC (a minimalistic Haskell compiler) weighed in at around 9 kloc. How big is your favorite C compiler in source-code terms? Stefan

At Sun, 21 Oct 2007 17:21:00 -0200, Maurício wrote:
Brandon S. Allbery KF8NH escreveu:
On Oct 21, 2007, at 14:40, Maurício wrote:
I like Haskell, and use it as my main language. However, compiling a Haskell program usually takes a lot of memory and CPU.
Last night I was running top, and noticed cc1 consuming 101MB of RAM :) I have also seen ar (the thing that makes .a files) consume > 512MB of RAM. So, perhaps GHC is just trying to keep up with C :) j.

On Mon, 2007-10-22 at 10:05 -0700, Jeremy Shaw wrote:
I like Haskell, and use it as my main language. However, compiling a Haskell program usually takes a lot of memory and CPU.
Last night I was running top, and noticed cc1 consuming 101MB of RAM :) I have also seen ar (the thing that makes .a files) consume > 512MB of RAM.
You need to upgrade your binutils! I sent in a patch to binutils to fix exactly this problem, where ar/ranlib take so much memory when building large ghc "split-objs" libraries. The fix got included in binutils 2.17. I do hope Linspire is using a version with the fix ;-). Duncan

All of Haskell was possible 20 years ago. The LML compiler (written in LML) compiled a language similar to Haskell; the only real differences were the syntax and the type system (and monadic IO hadn't been invented yet). It was a bit slow to recompile itself, but not bad. A 16MHz 386 and 8M of memory certainly sufficed.
-- Lennart
On 10/21/07, Maurício wrote:
Hi,
I like Haskell, and use it as my main language. However, compiling a Haskell program usually takes a lot of memory and CPU. So I got curious, and would like to know from the computer scholars on this list: how much of Haskell would be possible on machines with really low CPU and memory? Which features would be feasible for a compiler to implement, and for programmers to use?
Thanks, Maurício
participants (8)
- Brandon S. Allbery KF8NH
- Duncan Coutts
- Henning Thielemann
- Jeremy Shaw
- jerzy.karczmarczuk@info.unicaen.fr
- Lennart Augustsson
- Maurício
- Stefan O'Rear