
On 11 Oct 2007, at 1:00 pm, jerzy.karczmarczuk@info.unicaen.fr wrote:
An anonymous called ok writes:
I am not anonymous. That is my login and has been since 1979.
jerzy.karczmarczuk wrote [about "R"]:
... This is not a functional language. There is some laziness (which looks a bit like macro-processing), sure.
There is no macro processing in R (or S).
I know I've been superficial, but, please, *try* to understand my point.
Before anyone can try to understand a point, it has to be made.
There is a cheap (not always) way of making everything "lazy", by rewriting. If an expression is the argument of a function, what is passed is the representation of this expression. This gets evaluated in the context of the caller function, although perhaps in the environment of the argument itself, if there are external references. It is something *similar* to macros, and it is more or less what I understood from my - admittedly weak - knowledge of R, S, etc. (Frankly, I do not know them enough to make the difference). But this is not the same as the laziness - realization of the normal order of evaluation, call by name (need), etc.
First off, as someone who has implemented a couple of macro processors, I completely fail to see any similarity between S/R arguments and macro processing. YES an (expression, environment) pair is passed. But can you try to see *my* point? HOW THE THING IS IMPLEMENTED is completely unimportant (except that the approach S and R take makes strictly more things possible than the approach that GHC takes). What matters is WHAT THE BEHAVIOUR IS. And what you get is *precisely* call by need.
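For concreteness, here is a small R session of my own (the toy functions f, g and h are not from the thread) showing exactly that behaviour: the argument expression is evaluated at most once, only if it is actually used, and the unevaluated expression is still there if you ask for it, which is the "strictly more things possible" part.

    f <- function(x) {
      cat("entered f\n")
      x + x                        # the first use of x forces its promise ...
    }
    f({ cat("argument evaluated\n"); 21 })
    ## entered f
    ## argument evaluated          # ... and that happens exactly once, even
    ## [1] 42                      # though x is used twice: call by need

    g <- function(x) "never looked at x"
    g(stop("boom"))                # the argument is never used, so the error
    ## [1] "never looked at x"     # expression is never evaluated at all

    h <- function(x) substitute(x) # the unevaluated expression is still
    h(1 + 2 * 3)                   # available on request -- the (expression,
    ## 1 + 2 * 3                   # environment) pair, with no macro expansion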
There is a difference between call by name, and by reference.
I know that; the implementors of R (one of whom I know) know it too. The relevant contrast here is precisely with call by *reference*, because the *language* comparison that was salient for most S users was the one between S and Fortran, and it is pass by reference that Fortran has, not call by name.
Right. But then, laziness *AND* side effects may put you in a nice mess...
Indeed it does. I didn't say I thought it was a *good* mix, just that it *exists*.
Now, the discussion began with ideas how to advertize *functional* languages, not packages with dangerous, non-formalizable semantics,
I do not know where you get the idea that S semantics is not formalisable. The principal S reference contains a meta-circular interpreter. "Can lead the unwary into traps" is not at all the same as "cannot be formalised".
OF COURSE, there are untyped languages with suspended evaluation. Snobol had "unevaluated expressions", Icon has "co-expressions", etc.
This is once again to evade the point. (As it happens, I have, and occasionally use, both Icon and SNOBOL. I wonder how many other SNOBOL users remain.) In SNOBOL and Icon these things are *exceptions*; the normal argument passing convention is otherwise. In S (and therefore R), there is, as in Haskell, only ONE way to pass arguments, and that is call by need. Call by need in R is *not* exceptional. It isn't even just the norm. It is the *only* argument passing technique on offer.
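To underline the point: even *default* values for arguments go through the same mechanism. A two-line illustration of my own (the function name is invented); a default is a promise evaluated in the function's own frame, when and if it is first needed.

    area <- function(width, height = width) width * height
    area(3)        # the default for height is forced here, after width -> 9
    area(3, 4)     # a supplied argument simply replaces the default    -> 12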
But if the merits of FL include some protection against errors, issued from enforcing a concrete programming discipline, "R" doesn't seem to me a good example.
I was on the R mailing list for a couple of years. What a torrent of messages that was! Interestingly, the troubles you fear (with a mix of imperative actions and call by need) do not seem to be troublesome in practice. There really doesn't seem to be much if any need to protect against *those* errors. One of the commonest mistakes is using "&" (or "|") where "&&" (or "||") should have been used, and this is something that a type system might have helped with. But it is precisely a dynamically typed lazy language you were after, so "the lack of a static type system has more to do with errors than the possibility of mixing imperative actions with laziness" is probably something you do not want to hear.
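Since the "&"/"&&" slip came up, a tiny illustration of my own (the function and variable names are invented, and the exact behaviour of the bad `if` depends on the R version):

    c(TRUE, FALSE) & c(TRUE, TRUE)   # '&' is element-wise: TRUE FALSE
    TRUE && FALSE                    # '&&' expects single values: FALSE

    both_positive <- function(a, b) {
      if (a > 0 & b > 0)   # wrong operator: with vector arguments this hands
        "yes"              # 'if' a logical vector, not a single TRUE/FALSE
      else "no"
    }
    both_positive(c(1, 2), c(3, -4))
    # older versions of R used only the first element (with a warning),
    # giving "yes"; current R stops with an error that the condition has
    # length > 1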
I wrote:
I don't know what co-inductive constructions are.
OK, try to code in R the list [0,1,2,3, ...] using a co-recursive data definition,
You can safely assume that someone who doesn't know what co-inductive constructions are also doesn't know what co-recursive data definitions are. ("doesn't know" as in "is unfamiliar with the jargon", not as in "has never met the concept".)
in Haskell: integs = 0 : (ones + integs) where ones = 1 : ones and where (+) acts on lists element-wise.
I believe I already explained the reason that this is hard in R: it doesn't evaluate *arguments* but it does fully evaluate *results*. I think that's a much more insightful thing to say about R than to go on misleadingly about macros. Let's leave R behind. If one wants a lazy dynamically typed programming language that lets you construct "infinite lists" by using the basic language mechanisms in a simple and direct way, there's always Recanati's Lambdix, which is a lazy Lisp. I don't know whether that ever saw serious use, but it does show that the thing can be done.
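For concreteness, here is my own sketch (not part of the original exchange) of what the direct transliteration into R runs into. delayedAssign gives a lazy binding, but forcing it has to build the whole vector, because c() and "+" return fully evaluated results, so the self-reference is hit while the promise is still being forced.

    delayedAssign("ones",   c(1, ones))           # lazy bindings ...
    delayedAssign("integs", c(0, ones + integs))
    integs                                        # ... but forcing integs
    # forces ones, which needs ones again while it is still being evaluated;
    # R reports something like "promise already under evaluation: recursive
    # default argument reference or earlier problems?"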