Re: Interesting: "Lisp as a competitive advantage"

http://www.paulgraham.com/paulgraham/avg.html
I wonder how Haskell compares in this regard.
I loved Graham's characterization of the hierarchy of power in programming languages:
- Languages less powerful than the one you understand look impoverished
- Languages more powerful than the one you understand look weird
When I compare Lisp and Haskell, the big question in my mind is this: is lazy evaluation sufficient to make up for the lack of macros? I would love to hear from a real Lisp macro hacker who has also done lazy functional programming.
Norman

----- Original Message -----
From: "Norman Ramsey"
http://www.paulgraham.com/paulgraham/avg.html
I wonder how Haskell compares in this regard.
I loved Graham's characterization of the hierarchy of power in programming languages:
- Languages less powerful than the one you understand look impoverished
- Languages more powerful than the one you understand look weird
Same for me, although you should not fall into the trap of reversing it, i.e. assuming that if a language looks weird it is more powerful!
When I compare Lisp and Haskell, the big question in my mind is this: is lazy evaluation sufficient to make up for the lack of macros?
Don't you get dynamic scoping as well with macros?
I would love to hear from a real Lisp macro hacker who has also done lazy functional programming.
Norman

Norman Ramsey wrote:
I would love to hear from a real Lisp macro hacker who has also done lazy functional programming.
I am such a person. Lisp macros are a way to extend the Lisp compiler. Dylan's example shows why this reflective power is sometimes useful.

Here is another example. I once wrote a macro to help express pattern-matching rules. In these rules, variables that began with a question mark were treated specially.

Having learned Haskell, I am not tempted to go back to Lisp. Yet I occasionally wish for some sort of reflective syntactic extension.

- Tim Sauerwein
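(Editorial aside: for comparison, here is a hedged sketch, not Tim's actual code, of how such rules might be encoded in Haskell without macros. The "?"-prefixed pattern variables that the Lisp macro recognizes become an explicit constructor in an ordinary datatype, at the cost of heavier syntax; all names here, Term, Rule, match, are invented for illustration.)

```haskell
import qualified Data.Map as M
import Control.Monad (foldM)

data Term
  = Var String            -- plays the role of a "?x" pattern variable
  | Atom String
  | App String [Term]
  deriving (Eq, Show)

-- A rewrite rule: a left-hand pattern and a right-hand template.
data Rule = Rule Term Term

type Bindings = M.Map String Term

-- Match a pattern against a ground term, accumulating bindings.
-- (A real implementation would also check that repeated variables
-- bind consistently; this sketch simply overwrites.)
match :: Term -> Term -> Bindings -> Maybe Bindings
match (Var v)    t          bs = Just (M.insert v t bs)
match (Atom a)   (Atom b)   bs | a == b = Just bs
match (App f ps) (App g ts) bs
  | f == g && length ps == length ts =
      foldM (\acc (p, t) -> match p t acc) bs (zip ps ts)
match _          _          _  = Nothing
```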

Tim Sauerwein wrote:
I once wrote a macro to help express pattern-matching rules. In these rules, variables that began with a question mark were treated specially.
David Gifford's Programming Languages class at MIT uses Scheme+, a variant of MIT Scheme with datatypes and pattern matching. These extensions are implemented as macros. (http://tesla.lcs.mit.edu/6821) But of course Haskell has those already :) -- Mieszko
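(Editorial aside: a minimal illustration of what Haskell provides natively here, with a datatype and function invented for the example. What Scheme+ adds through a macro layer, algebraic datatypes and pattern matching, is built into the language.)

```haskell
-- No macro layer needed: datatypes and pattern matching are core syntax.
data Shape
  = Circle Double
  | Rect Double Double

area :: Shape -> Double
area (Circle r) = pi * r * r
area (Rect w h) = w * h
```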

Discussion about macros, Lisp, laziness etc. Too many people to cite. Alan Bawden uses macros to write assertions, and Dylan Thurston comments: ...
(assert (< x 3))
Which macro expands into:
(if (not (< x 3)) (assertion-failed '(< x 3)))
Where `assertion-failed' is a procedure that generates an appropriate error message. The problem being solved here is getting the asserted expression into that error message. I don't see how higher order functions or lazy evaluation could be used to write an `assert' that behaves like this.
This is a good example, which cannot be implemented in Haskell. "Exception.assert" is built into the GHC compiler, rather than being defined within the language. On the other hand, the built-in function gives you the source file and line number rather than the literal expression; the macro can't do the former.
--Dylan Thurston
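(Editorial aside: for reference, a small example, not from the thread, of the built-in assert as used from Haskell. In current GHC it lives in Control.Exception with type Bool -> a -> a, and the compiler rewrites call sites so a failure reports the source location, though not the text of the asserted expression.)

```haskell
import Control.Exception (assert)

safeDiv :: Int -> Int -> Int
safeDiv x y = assert (y /= 0) (x `div` y)

main :: IO ()
main = print (safeDiv 10 0)
-- Fails with something like "Assertion failed ... Main.hs:4:15":
-- the file and line, but not the failed expression itself.
```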
In general this is not true; look at macro preprocessing in C. If your parser is kind enough to yield to the user some pragmatic information about the text being read, say __LINE__ etc., you can code that kind of control with macros as well.

Macros in Scheme are used to unfold n-ary control structures such as COND into a hierarchy of IFs, etc. Nothing (in principle) to do with laziness or higher-order functions. They are also used to define object-oriented layers in Scheme or Lisp. I used them to emulate curried functions in Scheme. I think that they are less popular nowadays because they are dangerous, badly structured, and difficult to write "hygienically".

Somebody (Erik Meijer?) asked: "Don't you get dynamic scoping as well with macros?" Well, what is dynamic here? Surely this is far from "fluid" bindings; it is a good way to produce name trapping and other diseases.

In Clean there are macros. They are rather infrequently used... In C++ a whole zone of macro/preprocessor coding began to disappear with the arrival of inlining, templates, etc.

I think that macros belong to *low-level* languages: languages where, under the parsing surface, you feel the underlying virtual machine. You can do fabulous things with them. My favourite example is the language BALM. Many years before ML, Haskell etc., this was a functional, Lisp-like language with a more classical, Algol-like syntax, infix operators, etc. The language worked on CDC mainframes, under SCOPE/NOS. Its processor was written in assembler (Compass). But you should have a look at its implementation! Yes, assembler, nothing more. But this assembler was so macro-oriented, and so incredibly powerful, that the instructions looked like Lisp: with recursion, parentheses, LET forms which allocated registers, and other completely, deliciously crazy constructs. In fact, the authors used macros to implement the entire Lisp machine used to process BALM programs.

//Side remark: don't ask me where to find BALM. I tried, I failed. If *YOU* find it, let me know//

Another place where macros were the main workhorses was the MAINBOL implementation of Snobol4. But when people started to implement Spitbol and other variants of Snobol4, they decided to use a more structured, higher-level approach (there was even another, portable assembler with higher-level instructions "embedded", which avoided the use of macros).

Jerzy Karczmarczuk
Caen, France
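(Editorial aside: one way to see the laziness side of this trade-off, my own illustration rather than Jerzy's. In Haskell an n-ary conditional like Lisp's COND can be an ordinary function, because unused branches are never evaluated, so no macro expansion into nested IFs is needed. Names below are invented.)

```haskell
-- cond takes (test, result) pairs and a default; laziness guarantees
-- that only the selected result is ever evaluated.
cond :: [(Bool, a)] -> a -> a
cond []               def = def
cond ((True,  x) : _) _   = x
cond ((False, _) : r) def = cond r def

classify :: Int -> String
classify n = cond
  [ (n < 0,  "negative")
  , (n == 0, "zero")
  ] "positive"
```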

Fri, 04 May 2001 12:57:29 +0200, Jerzy Karczmarczuk writes:
In Clean there are macros. They are rather infrequently used...
I think they roughly correspond to inline functions in Haskell. They are separate in Clean because module interfaces are written by hand, so the user can include something to be expanded inline in other modules by making it a macro. In Haskell module interfaces are generated by the compiler, so they can contain unfoldings of functions worth inlining without explicit distinguishing in the source.

-- Marcin 'Qrczak' Kowalczyk * qrczak@knm.org.pl * http://qrczak.ids.net.pl/

Norman Ramsey writes:
When I compare Lisp and Haskell, the big question in my mind is this: is lazy evaluation sufficient to make up for the lack of macros?
It might make sense for Haskell to have a facility that makes it possible for the programmer to define new bits of syntactic sugar without changing the compiler. E.g., I recently wanted

  case2 foo of ...

as sugar for

  foo >>= \output -> case output of ...

If you want to call such an easy-sugar-making facility a macro facility, fine by me.

Personally, I wouldn't bother with such a facility. Aside from simple macros that are just sugar, macros go against the Haskell philosophy, IMO, because macros do not obey as many formal laws as functions do and because a macro is essentially a new language construct. Rather than building what is essentially a language with dozens and dozens of constructs, the Haskell way is to re-use the three constructs of lambda calculus over and over again (with enough sugar to keep things human-readable).
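(Editorial aside: a sketch of how close an ordinary function can get to the sugar Richard describes. "case2" here is just a hypothetical name for a flipped-argument bind; note that the temporary name "output" is still needed, which is exactly the part only real syntactic sugar or a macro-like facility could remove.)

```haskell
-- A hypothetical helper, not standard Haskell: simply (>>=) under another name.
case2 :: Monad m => m a -> (a -> m b) -> m b
case2 = (>>=)

example :: IO ()
example =
  case2 getLine $ \output -> case output of
    ""   -> putStrLn "empty line"
    line -> putStrLn ("got: " ++ line)
```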

On 04-May-2001, Marcin 'Qrczak' Kowalczyk wrote:
Jerzy Karczmarczuk writes:
In Clean there are macros. They are rather infrequently used...
I think they roughly correspond to inline functions in Haskell.
They are separate in Clean because module interfaces are written by hand, so the user can include something to be expanded inline in other modules by making it a macro.
In Haskell module interfaces are generated by the compiler, so they can contain unfoldings of functions worth inlining without explicit distinguishing in the source.
I don't think that Clean's module syntax is the reason.
(Or if it is the reason, then it is not a _good_ reason.)
After all, compilers for other languages where module interfaces
are explicitly written by the programmer, e.g. Ada and Mercury, are
still capable of performing intermodule inlining and other intermodule
optimizations if requested.
My guess is that the reason for having macros as a separate construct
is that there is a difference in operational semantics, specifically
with respect to laziness, between macros and variable bindings.
However, this is just a guess; I don't know Clean very well.
--
Fergus Henderson

Sat, 5 May 2001 04:44:15 -0700 (PDT), Richard writes:
eg, I recently wanted
case2 foo of ...
as sugar for
foo >>= \output-> case output of ...
Yes, I often miss OCaml's 'function' and SML's 'fn' syntax, which allow dispatching without inventing a temporary name for either the argument or the function. Today I ran across exactly your case. In non-pure languages you would just write 'case foo of'. I would be happy with just 'function':

  get >>= function
    ... -> ...
    ... -> ...

I wonder if these parts of Haskell's syntax will stay forever or whether there is a chance of some more syntactic sugar.

-- Marcin 'Qrczak' Kowalczyk * qrczak@knm.org.pl * http://qrczak.ids.net.pl/
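(Editorial aside on how this played out: later GHC versions added the LambdaCase extension, which gives roughly the form Marcin asks for. This example uses that later extension, so it would not have compiled in 2001.)

```haskell
{-# LANGUAGE LambdaCase #-}

-- Dispatch on a monadic result without naming it, via \case.
main :: IO ()
main = getLine >>= \case
  ""   -> putStrLn "empty"
  line -> putStrLn line
```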

Marcin 'Qrczak' Kowalczyk writes:
I think [Clean macros] roughly correspond to inline functions in Haskell.
That's right. I think the most important difference is that Clean macros can also be used in patterns (provided they don't have a lowercase name or contain local functions). The INLINE pragma for GHC is advisory; macros in Clean will always be substituted.
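(Editorial aside: for concreteness, the advisory GHC counterpart Ronny mentions looks like this; the function is made up for the example.)

```haskell
-- GHC may inline this across modules, but the pragma is a hint,
-- not a guaranteed substitution the way a Clean macro is.
{-# INLINE square #-}
square :: Int -> Int
square x = x * x
```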
They are separate in Clean because module interfaces are written by hand, so the user can include something to be expanded inline in other modules by making it a macro.
In Haskell module interfaces are generated by the compiler, so they can contain unfoldings of functions worth inlining without explicit distinguishing in the source.
Fergus Henderson replies:
I don't think that Clean's module syntax is the reason. (Or if it is the reason, then it is not a _good_ reason.) [...]
You're right, having hand-written interfaces doesn't preclude compiler-written interfaces (or optimisation files). Let's call it a pragmatic reason: Clean macros are there because we don't do any cross-module optimisations and we do want some form of inlining.

Cheers, Ronny Wichers Schreur
participants (9)
- elf@sandburst.com
- Erik Meijer
- Fergus Henderson
- Jerzy Karczmarczuk
- Marcin 'Qrczak' Kowalczyk
- Norman Ramsey
- Richard
- Ronny Wichers Schreur
- Tim Sauerwein