
It seems you decided to ignore my message. OK.
Whoa there! Why assume malice? I got both his quoted response and your message at about the same time (...)
(...) *dismisses* Gofer as something so old that it couldn't possibly be related to modern languages. Mind you, Mark Jones did it before 1994, (...) OK, don't shout, I know I exaggerate... What I found somewhat funny, with all respect, is the combination: 20 years ago, which means '87, and a miserable 4M of memory. At that time 4M on a personal computer was not so common, at least in Europe.

Jerzy Karczmarczuk
Sorry, Jerzy. Brandon's message was just faster to answer; I'll need some time to check Gofer. I had also written PC-AT with 256KB in my original message, but I changed it to 386 since I didn't want you guys to feel under attack from reactionaries. People's feelings are easy to hurt, and it's difficult to please everyone :)

20 years ago, I wrote a brute-force attack on a magazine game, but by the time my TK-3000 Apple found an answer the due date had long passed and I could not get a prize from the magazine. In '92, when my family got a PC-AT, the same game was solved in 5 minutes, so to this day that PC is still my psychological reference for "all the power I need". I enjoyed a Prolog compiler on that system, and my intuition says Haskell could also fit there. And, at the same time, today's operating systems are happy to announce how easily they turn your dual-core into a great video cassette :(

Anyway, what I would like is a "theoretical" answer. Is there something fundamentally different between a C compiler and a Haskell one that makes the former fit into 30KB but not the latter? If so, what compiler algorithms are those, and what are their theoretical needs? I really don't need an easy answer. If you guys just suggest computer theory books I will be happy to look for them, but today I wouldn't know what to read.

Best,
Maurício