
I only just subscribed to this mailing list and I am a complete Haskell newbie, so forgive me if this is too OT. I noticed a recent thread about writing a Mathematica implementation in Haskell. I think this is an excellent idea and would be a great project for a Haskell newbie. I wrote a toy Mathematica implementation in OCaml while I waited to be viva'd for my PhD. It garnered so much interest that Wolfram Research bought it from me for £4,500 and gave me several free copies of Mathematica.

Regarding the specific points made:

1. Numerical libraries: you should be able to reuse existing libraries like GMP, BLAS, LAPACK, FFTW and so on. These are often much faster than Mathematica's. For example, FFTW was about 4x faster than Mathematica's FFT the last time I benchmarked it. However, they do not support interval arithmetic.

2. GUI: I would take our existing vector graphics software:

http://www.ffconsultancy.com/products/smoke_vector_graphics/
http://www.ffconsultancy.com/products/fsharp_for_visualization/

and rewrite it in Haskell as the foundation. This would far exceed anything that Mathematica has to offer, in part because Mathematica's graphics are still evaluated via the completely generic rewrite engine, which is extremely slow. Our code already implements high-performance hardware-accelerated vector graphics and it is probably one of the first things I would consider porting to Haskell (if there is any commercial interest in such a library).

3. The language: the hardest part of reimplementing Mathematica is inferring what it means (there are no formal evaluation semantics). Once you've done that it is just a case of implementing an extensible term rewriter and putting in about 20 core rules. The pattern matcher is well under 100 LOC and you can do various things to make it more efficient. There are two tricks that vastly improve performance of the rewriter: return physically identical results whenever possible, and perform substitution and evaluation at the same time rather than as two separate passes.

4. Libraries: you should have no trouble exceeding the capabilities of Mathematica by pulling in existing libraries. For example, Mathematica provides no wavelet transforms, no time-frequency transforms, no function minimization over an arbitrary number of variables, etc.

Having worked in numerical computing for many years, I can say that Mathematica is an excellent tool for prototyping and for doing disposable analyses, but many people reach for conventional languages when Mathematica's capabilities run dry. You should easily be able to implement a rewriter for the language that is ten times faster and doesn't leak.

Incidentally, my implementation of Mathematica in OCaml took four days, and it was one of my first OCaml programs.

--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
OCaml for Scientists
http://www.ffconsultancy.com/products/ocaml_for_scientists/?e
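[Editorial note: the extensible term rewriter described above can indeed be sketched in very little Haskell. The following is a minimal, illustrative version only; the Expr type, the rule representation, and the bottom-up strategy are assumptions of this sketch, not Jon's actual design. The "physically identical results" trick is omitted (sharing works differently in Haskell), and a careless rule set can make it loop.]

```haskell
import qualified Data.Map as M

-- Mathematica-like terms: symbols, numbers, applications, and named
-- pattern variables (x_ becomes Pat "x").
data Expr = Sym String | Num Integer | App String [Expr] | Pat String
  deriving (Eq, Show)

type Bindings = M.Map String Expr

-- Structural matching: bind pattern variables, demand equality elsewhere.
-- A repeated pattern variable must match equal subterms.
match :: Expr -> Expr -> Bindings -> Maybe Bindings
match (Pat v) e b = case M.lookup v b of
  Nothing           -> Just (M.insert v e b)
  Just e' | e' == e -> Just b
  _                 -> Nothing
match (App f ps) (App g es) b
  | f == g && length ps == length es =
      foldl (\mb (p, e) -> mb >>= match p e) (Just b) (zip ps es)
match p e b = if p == e then Just b else Nothing

-- Substitute bindings into the right-hand side of a rule.
subst :: Bindings -> Expr -> Expr
subst b (Pat v)    = M.findWithDefault (Pat v) v b
subst b (App f es) = App f (map (subst b) es)
subst _ e          = e

-- Rewrite bottom-up with a rule list, repeating until no rule fires.
rewrite :: [(Expr, Expr)] -> Expr -> Expr
rewrite rules = go
  where
    go (App f es) = fire (App f (map go es))
    go e          = fire e
    fire e = case [ subst b rhs | (lhs, rhs) <- rules
                                , Just b <- [match lhs e M.empty] ] of
      (e' : _) -> go e'
      []       -> e
```

With the single rule Plus[0, x_] -> x, i.e. `(App "Plus" [Num 0, Pat "x"], Pat "x")`, rewriting `App "Plus" [Num 0, Sym "y"]` yields `Sym "y"`.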

Jon Harrop wrote:
I noticed a recent thread about writing a Mathematica implementation in Haskell.
Yeah, that was me.
I think this is an excellent idea and would be a great project for a Haskell newbie.
Uh... I think it's actually a tad harder than it looks. [Understatement!]
I wrote a toy Mathematica implementation in OCaml while I waited to be viva'd for my PhD. It garnered so much interest that Wolfram Research bought it from me for £4,500 and gave me several free copies of Mathematica.
Are you serious?! o_O
Regarding the specific points made:
1. Numerical libraries: you should be able to reuse existing libraries like GMP, BLAS, LAPACK, FFTW and so on. These are often much faster than Mathematica's. For example, FFTW was about 4x faster than Mathematica's FFT the last time I benchmarked it. However, they do not support interval arithmetic.
Now this is interesting. The claim is that Mathematica is the fastest, most powerful software on planet earth, second to none. Actually it turns out that, at least for factoring moderately big integers, PARI/GP seems to be a fair bit faster (50% or so). I have no idea about the rest. Note that (as I understand it) GHC implements Haskell's Integer type using GMP. And for some reason or other, they want to remove this feature...
2. GUI: I would take our existing vector graphics software:
http://www.ffconsultancy.com/products/smoke_vector_graphics/ http://www.ffconsultancy.com/products/fsharp_for_visualization/
and rewrite it in Haskell as the foundation. This would far exceed anything that Mathematica has to offer, in part because Mathematica's graphics are still evaluated via the completely generic rewrite engine which is extremely slow. Our code already implements high-performance hardware-accelerated vector graphics and it is probably one of the first things I would consider porting to Haskell (if there is any commercial interest in such a library).
Erm... have you seen Mathematica 6? That's OpenGL accelerated too. I've just been playing with it in fact - it's pretty fast as far as I can tell.
3. The language: the hardest part of reimplementing Mathematica is inferring what it means (there are no formal evaluation semantics). Once you've done that it is just a case of implementing an extensible term rewriter and putting in about 20 core rules. The pattern matcher is well under 100 LOC and you can do various things to make it more efficient. There are two tricks that vastly improve performance of the rewriter: return physically identical results whenever possible, and perform substitution and evaluation at the same time rather than as two separate passes.
Haskell has pattern matching, but what Mathematica does is much more sophisticated. I have tried to implement it several times, and failed. (But that was Pascal, this is Haskell...)
4. Libraries: You should have no trouble exceeding the capabilities of Mathematica by pulling in existing libraries. For example, Mathematica provides no wavelet transforms, no time-frequency transforms, no function minimization over an arbitrary number of variables etc.
What...the...hell...? Mathematica contains the largest, most comprehensive set of implementations of special functions anywhere in the world. It has a *vast* collection of identities and transformation rules constituting man-centuries of R&D work. It has cutting edge symbolic integration capabilities. It has multiple numerical solver algorithms. It has... Yeah, should only take 5 minutes or so to exceed those capabilities. Easy really...
You should easily be able to implement a rewriter for the language that is ten times faster and doesn't leak.
Incidentally, my implementation of Mathematica in OCaml took four days, and it was one of my first OCaml programs.
OK, so you're saying that in 4 days you wrote something that out-performs Mathematica, a program that has existed for decades and has a vast, highly-funded R&D effort behind it featuring some of the brightest minds in the field? I'm in a state of disbelief here.

Hallo,
On 5/30/07, Andrew Coppin
OK, so you're saying that in 4 days you wrote something that out-performs Mathematica, a program that has existed for decades and has a vast, highly-funded R&D effort behind it featuring some of the brightest minds in the field?
I'm in a state of disbelief here.
If you want some amusement, just search for "Jon Harrop" in comp.lang.lisp. -- -alex http://www.ventonegro.org/

OK, so you're saying that in 4 days you wrote something that out-performs Mathematica, a program that has existed for decades and has a vast, highly-funded R&D effort behind it featuring some of the brightest minds in the field?
If you want some amusement, just search for "Jon Harrop" in comp.lang.lisp.
Write a newer, better, faster Mathematica in four days! I will show you how! Just attend one of my seminars coming soon to your area! [*]
[*] Some attendees may not actually build a faster Mathematica in four days. Results vary. Tim Newsham http://www.thenewsh.com/~newsham/

Why do you seem so in awe of Mathematica? It's just another language with a good set of libraries. Claims that it is the best, fastest, etc. come from Wolfram advertising, no doubt. :)

-- Lennart

This will be a long sermon. Sorry. Lennart Augustsson writes:
Why do you seem so in awe of Mathematica? It's just another language with a good set of libraries. Claims that it is the best, fastest, etc comes from Wolfram advertising, no doubt. :)
All this discussion began to degenerate a bit (I don't know why, but it happens almost always when people begin to speak about Mathematica in a context far from it... There is, it seems, some Original Sin in this business, but most of you are too young to remember the well-known Wolfram Controversy, when SMP transmuted into Mathematica...) Anyway...

Mathematica made its career not as a *language*, and not immediately as a set of libraries, but as an *integrated package* capable of doing SYMBOLIC mathematics, with very decent user interfacing and graphics a bit better than its competitors. The conditions of its career were far from obvious. The world had many symbolic math packages: Reduce, Macsyma, Schoonschip (beloved by high-energy physicists), Maple, Scratchpad2/Axiom, later MuSIMP/MuMATH for small platforms, etc. Wolfram's group knew all that; they knew that in order to implement something reasonable, one has to fulfil several conditions.

* The algebraic expressions must have a sound design; there must be a sufficiently universal, yet efficient representation of math formulae. For polynomial arithmetic this is trivial; it is one of my standard Haskell exercises at the undergraduate level. Symbolic differentiation as well. But already polynomial factorization may be a mess, and requires a good deal of algorithmic knowledge. I am reluctant to believe that anybody implemented this in 4 days... Anybody tried Zassenhaus? Not *too* complicated (implemented in Pari and elsewhere), but quite elaborate. For general functors the *simplification* issue is not decidable. You can't assess a given representation as "the best" formula with a given semantics. Again, the simplifier/evaluator is a complicated part of the package, not something you can do in a few days. Please have a look at the internal structure of DoCon by Sergei Mechveliani; he did a lot of work in Haskell, and the story is still incomplete. (Let's omit the real mess, for example the Risch symbolic integration algorithms, efficient Gröbner bases, etc.)

* The first symbolic packages treated symbolic expressions first of all as something to be evaluated/simplified. One sees that Maple has been built on this principle. Mathematica changed the perspective a bit, along (perhaps) the same lines as Schoonschip, where the fundamental stuff was *rewriting/transformations*. So Mathematica since the beginning was equipped with a very powerful pattern-matcher/substitution contraption. For the sake of efficiency it was less powerful than a general unifier, but it was really nice (and it existed already in SMP, before the birth of Mathematica). Now, again, somebody would do that in 4 days?? The semantic pattern-matcher within an algebraic package is worlds apart from the syntactic/structural pattern-matcher of Haskell. This helped a lot to popularize Mathematica, and has been shamelessly abused in the advertising, where our friends used to say "we DO MATHEMATICS with computers". Nonsense, of course...

* All the numerical, standard stuff: the interface between the symbolic and the numerical functions, with 2D/3D plots, etc. Too often people speak about that, comparing, say, Matlab with Mathematica (while Matlab has no symbolics, although, being a decent object-oriented language, it has tools which permitted the construction of symbolic toolboxes, the linking of the Maple library, etc.) Here the Mathematica team did a serious, thorough job, trying to adapt the richness of this layer to many branches of science and engineering. It was mainly a compilation process; they hardly added anything new, but they made a coherent, useful library. You won't repeat it in 4 days, or even in 4 months.

=====================================

Is there any sense in "making Mathematica with Haskell"? As a package, most certainly no: too much to implement, and for what? In order to have another tool of the kind which already exists?

Sergei did a feasibility study, and worked hard on the interplay between mathematical structures and the Haskell type system. This, surely, *IS* useful work. And it should continue. We (in the generic meaning of the True Believers in the Functional Church) can implement other things, for example some formal mathematics dealing with logic, or with category theory, or with computational geometry, or with (my dream) the *true* implementation of quantum calculi.

Knock, knock! Wake up, the sermon is over.

Jerzy Karczmarczuk
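[Editorial note: the "standard undergraduate Haskell exercise" mentioned above, symbolic differentiation, really is small. A minimal sketch follows; the expression type and names are illustrative, and the cleanup pass is deliberately naive, which already hints at why full simplification is the hard part.]

```haskell
-- Expressions in one variable X, with integer constants.
data E = X | C Integer | Add E E | Mul E E
  deriving (Eq, Show)

-- The textbook rules: d/dx x = 1, constants vanish, linearity, product rule.
d :: E -> E
d X         = C 1
d (C _)     = C 0
d (Add f g) = Add (d f) (d g)
d (Mul f g) = Add (Mul (d f) g) (Mul f (d g))

-- A naive bottom-up cleanup pass; a real simplifier is where the work starts.
s :: E -> E
s (Add a b) = case (s a, s b) of
  (C 0, y)   -> y
  (x, C 0)   -> x
  (C m, C n) -> C (m + n)
  (x, y)     -> Add x y
s (Mul a b) = case (s a, s b) of
  (C 0, _)   -> C 0
  (_, C 0)   -> C 0
  (C 1, y)   -> y
  (x, C 1)   -> x
  (C m, C n) -> C (m * n)
  (x, y)     -> Mul x y
s e = e
```

For example, `s (d (Mul X X))` reduces `1*x + x*1` to `Add X X`, i.e. 2x; but this pass will not recognise that `Add X X` is `Mul (C 2) X`, which is exactly the kind of decision a real simplifier must make.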

On Thursday 31 May 2007 11:39:14 jerzy.karczmarczuk@info.unicaen.fr wrote:
... Mathematica changed the perspective a bit, along (perhaps) the same lines as Schoonschip, where the fundamental stuff was *rewriting/transformations*. So Mathematica since the beginning was equipped with a very powerful pattern-matcher/substitution contraption. For the sake of efficiency it was less powerful than a general unifier, but it was really nice (and it existed already in SMP, before the birth of Mathematica). Now, again, somebody would do that in 4 days?? The semantic pattern-matcher within an algebraic package is worlds apart from the syntactic/structural pattern-matcher of Haskell.
Can you elaborate on this? I would imagine that the pattern matcher in a term-level Haskell interpreter would be quite similar to one from a toy Mathematica-like rewriter. Also, what aspects of this do you think would take more than 4 days? -- Dr Jon D Harrop, Flying Frog Consultancy Ltd. OCaml for Scientists http://www.ffconsultancy.com/products/ocaml_for_scientists/?e

Jon Harrop after myself:
The semantic pattern-matcher within an algebraic package, is worlds apart from the syntactic/structural pattern-matcher of Haskell.
Can you elaborate on this?
I would imagine that the pattern matcher in a term-level Haskell interpreter would be quite similar to one from a toy Mathematica-like rewriter.
Also, what aspects of this do you think would take more than 4 days?
Well, I don't know what your definition of "toy" is. I can make a toy atomic bomb in about 15 minutes, using Lego.

Certainly, there is a possibility and a need for syntactic pattern matching, as in Haskell, although the identification of common terms is so important that I would rather use a unification procedure, like in logic languages. But this is not enough for computer algebra. OK, if you wish some SIMPLE examples (among infinitely many more) of something beyond:

* the simplification of rational expressions, beginning with the reduction of (x^2-1)/(1+x) - x = -1, and ending with a full polynomial factorizer... Gathering all the x's in a + x + b + x + ... + x + ... so as to get M*x is already not so trivial, and requires some decisions. Will you keep your expressions sorted? How?...

* sin(z)^2 + ... + cos(z)^2 should simplify to 1 + ... independently of z, so the equivalence of shared expressions should be treated seriously.

We are usually not interested in 'term-level' trivialities. If you look at the plethora of pattern-matching functions in Mathematica (all those MemberQ, MatchQ, FreeQ, all those DeleteCases, the functions which give you the *position* at which a given pattern occurs within a formula, etc.), you become a bit modest. If you add to it the possibility/necessity of renaming the variables in patterns present within the Function[{x}...] formulae, you see that just assimilating the documentation needs something like 4 days. Of course, this is not restricted to Mathematica. All computer algebra systems have such tools. But, OK, I am old, tired and slow. There are, fortunately for us, some people very fast and efficient.

Jerzy Karczmarczuk

PS. Somebody (A. Coppin?) said that Mathematica not without reason costs 10000. Weeeeelll, less than 2000, and for students there are much cheaper possibilities. I am the last to make free ads for Wolfram, I recommend the usage of Axiom and Maxima to my students, but there is no need to say publicly some dubious truths about commercial software.
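[Editorial note: one concrete answer to the "will you keep your expressions sorted?" question above is yes: flatten the sum, impose a canonical order, and group equal terms. A toy sketch over atomic terms, under the assumption that the sum has already been flattened:]

```haskell
import Data.List (group, sort)

-- Collect like terms in a flattened sum a+x+b+x+x by sorting into a
-- canonical order and grouping, turning n copies of a term into the
-- coefficient-term pair (n, term).
collect :: [String] -> [(Int, String)]
collect ts = [ (length g, head g) | g <- group (sort ts) ]
```

So `collect ["a","x","b","x","x"]` gives `[(1,"a"),(1,"b"),(3,"x")]`, i.e. a + b + 3*x. The design decision Jerzy flags is real: the canonical order chosen here (lexicographic) is arbitrary, and a serious system must pick one and live with its consequences everywhere.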

jerzy.karczmarczuk@info.unicaen.fr wrote:
PS. Somebody (A. Coppin?) said that Mathematica not without reason costs 10000. Weeeeelll, less than 2000, and for students there are much cheaper possibilities. I am the last to make free ads for Wolfram, I recommend the usage of Axiom and Maxima to my students, but there is no need to say publicly some dubious truths about commercial software.
Yes, that was me. And there's no need for "dubious truths" - anybody who wants to can check the price right now: http://store.wolfram.com/view/app/mathematica/ Mmm, that *is* interesting... The price has indeed changed to £2,035. Ah well, it might as well be £2,000,000 for all the difference it makes - I will never own anywhere near that kind of money. :-( PS. I wonder why it costs more on UNIX...?

Andrew Coppin writes about my objection on the Mathematica price he mentioned :
...And there's no need for "dubious truths" - anybody who wants to can check the price right now:
http://store.wolfram.com/view/app/mathematica/
Mmm, that *is* interesting... The price has indeed changed to £2,035. Ah well, it might as well be £2,000,000 for all the difference it makes - I will never own anywhere near that kind of money. :-(
Cool down. Take any Google-accessible site which sells Mathematica. Say, http://www.unisoftwareplus.com/products/mathematica/edu.html - a single licence is 1345 with 1 year of service. I admit that it is still something an individual won't pay; this is for a university fellow who got a fat grant and doesn't know where to throw the money. Check the students' offer: Class A boxed, 128 dollars. Admit that this is more serious, although *not* for an average French student (1/3 of what he usually pays for his lodging).
PS. I wonder why it costs more on UNIX...?
Simply because it is *easier* to make deeply integrated, graphics-rich and secure software on Windows. Linux remains behind not because those folks are lazy (the progress *IS* considerable), but because, with commercial support, the Windows world moves forward a bit faster! Now, Haskell evolves in both worlds, but we have already seen that it was easier to link it with the graphical support on Windows... Such is life. Jerzy Karczmarczuk

jerzy.karczmarczuk@info.unicaen.fr wrote:
The conditions of its career were far from obvious. The World had many symbolic math packages: Reduce, Macsyma, Schoonschip (beloved by high- energy physicists), Maple, Scratchpad2/Axiom, later MuSIMP/MuMATH for small platforms, etc.
I find that statement interesting. I have never come across *any* other package that can perform _symbolic_ mathematics. (Sure, there are packages that can perform specific operations - solving certain kinds of equations, transforming matrices, rearranging formulas. But I have never seen any other package where you can just do *anything*.)
* First symbolic packages treated *first* the symbolic expressions as something to be evaluated/simplified. One sees that Maple has been built on this principle. Mathematica changed a bit the perspective, along - perhaps - the same lines as Schoonschip, where the fundamental stuff was *rewriting/ transformations*. So, Mathematica since the begininng was equipped with a very powerful pattern-matcher/substitution contraption. For the sake of efficiency it was less powerful than a general unifier, but it was really nice (and it existed already in SMP, before the birth of Mathematica). Now, again, somebody would do that in 4 days?? The semantic pattern-matcher within an algebraic package, is worlds apart from the syntactic/structural pattern-matcher of Haskell. This helped a lot to popularize Mathematica, and has been shamelessly abused in the advertising, where our friends used to say "we DO MATHEMATICS with computers". Non-sense, of course...
Pattern matching *so* rich, in fact, that you can even use it to do things that aren't mathematics - although the default input syntax isn't really geared to it. But yes - I have tried to implement that pattern matching engine a couple of times in Pascal. (Remember Pascal?) Getting it to work for a few test cases is easy. Getting it to *properly* handle associativity and commutativity is really nontrivial. (I mean *really* nontrivial! Or perhaps I am an inferior programmer - one of the two!)
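[Editorial note: the difficulty with commutative (Orderless, in Mathematica's terms) matching can be made concrete. The obviously correct strategy is to try every reordering of the subject's arguments, which is factorial in their number. A deliberately naive Haskell sketch with an illustrative, made-up pattern type:]

```haskell
import Data.List (permutations)

-- Pattern elements: a literal argument or a wildcard (like x_).
data P = Lit Char | Wild deriving (Eq, Show)

-- Ordered, Haskell-style matching of an argument sequence.
matchSeq :: [P] -> String -> Bool
matchSeq ps es = length ps == length es && and (zipWith ok ps es)
  where ok Wild    _ = True
        ok (Lit c) e = c == e

-- Commutative matching: succeed if ANY reordering of the subject matches.
-- Correct but factorial; a real matcher must be far cleverer than this.
matchOrderless :: [P] -> String -> Bool
matchOrderless ps es = any (matchSeq ps) (permutations es)
```

Here `matchSeq [Lit 'a', Wild] "ba"` fails, but `matchOrderless` succeeds, just as Plus[a, x_] must match Plus[b, a]. Add associativity (a pattern variable matching a *sequence* of arguments) and the search space gets worse still, which is presumably why the naive Pascal attempts hurt.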
* All the numerical, standard stuff, the interface between the symbolic and the numerical functions, with plots 2D/3D, etc. Too often people speak about that, comparing, say, Matlab with Mathematica (while Matlab has no symbolics, etc.) Here the Mathematica team did a serious, thorough job, trying to adapt the richness of this layer to many branches of science and engineering. It was mainly a compilation process, they hardly added anything new, but made a coherent, useful library. Won't repeat it in 4 days, or even in 4 months.
Again, Mathematica has all these functions defined, it has vast *libraries* of identities for transforming them, *and* it has efficient numerical algorithms to compute them. (If you believe the product literature, for many functions there are several different possible algorithms depending on how accurate you want it, what arguments you're trying to compute it on, etc.) I really don't see anybody easily duplicating all this...
===================================== Is there any sense in "making Mathematica with Haskell"? As a package, most certainly no, too much to implement, for what? In order to have another tool of the kind which already exists?
...other symbolic math tools exist?

Andrew Coppin cites me and asks:
jk wrote:
... The World had many symbolic math packages: Reduce, Macsyma, Schoonschip (beloved by high- energy physicists), Maple, Scratchpad2/Axiom, later MuSIMP/MuMATH for small platforms, etc.
I find that statement interesting. I have never come across *any* other package that can perform _symbolic_ mathematics.
(Sure, there are packages that can perform specific operations - solving certain kinds of equations, transforming matrices, rearranging formulas. But I have never seen any other package where you can just do *anything*.)
You must be joking, but OK, I am naive, and I will answer as if it were serious. Really, haven't heard about Maple??? http://www.maplesoft.com/ Its limited library is integrated within the Matlab Symbolic Toolbox, and if you *have* Maple and Matlab, the latter can use the full force of the former. Maple is commercial like Mathematica, but there are perfectly usable free packages as well, for example Axiom and Maxima (Macsyma rekindled). http://wiki.axiom-developer.org/FrontPage http://maxima.sourceforge.net/ There is also a system constructed in Paderborn, MuPAD, which was free, but for survival reasons it became 100% commercial (although the old free version still circulates on the Web...) http://www.mupad.de/ There is more free stuff: GAP, Macaulay, ... In general, check http://en.wikipedia.org/wiki/Comparison_of_computer_algebra_systems DoCon is mentioned here as well. Some disappeared, but the choice remains pretty large.
But yes - I have tried to implement that pattern matching engine a couple of times in Pascal. (Remember Pascal?) Getting it to work for a few test cases is easy. Getting it to *properly* handle associativity and commutativity is really nontrivial. (I mean *really* nontrivial! Or perhaps I am an inferior programmer - one of the two!)
The mathematically sensitive matching/substitution is a hard task even if you have at your disposal a reasonably full unifier. Forget Pascal; take Prolog, which will save several days/weeks of the implementation of basic stuff. Even then, it will be quite tedious to write a package able, say, to reduce rational formulae, to reduce a polynomial modulo an ideal, to implement the basic trig identities, to find a reasonable common form for expressions containing complex exponentials AND trigonometrics, etc.
...other symbolic math tools exist?
Ask a few more times.

jerzy.karczmarczuk@info.unicaen.fr wrote:
Andrew Coppin cites me and asks:
I find that statement interesting. I have never come across *any* other package that can perform _symbolic_ mathematics.
(Sure, there are packages that can perform specific operations - solving certain kinds of equations, transforming matricies, rearranging formulas. But I have never seen any other package where you can just do *anything*.)
You must be joking, but OK, I am naive, and I will answer as it were serious. Really, haven't heard about Maple??? http://www.maplesoft.com/
Last I heard, Maple is simply another fast number-chrunking engine.
Its limited library is integrated within Matlab Symbolic Toolbox, and if you *have* Maple and Matlab, the latter can use the full force of the former.
Now Matlab I have heard of. (And I hate it with a passion.)
Maple is commercial as Mathematica, but there are perfectly usable free packages as well, for example Axiom and Maxima (Macsyma rekindled). http://wiki.axiom-developer.org/FrontPage http://maxima.sourceforge.net/
Now this actually looks vaguely promising. (Rather confused at this point as to what the relationship between Axiom, Maxima, Reduce and Sage is...)
The are more free stuff, GAP, MaCaulay,... In general, check http://en.wikipedia.org/wiki/Comparison_of_computer_algebra_systems
I did indeed check that list a while back. I looked at a whole heap of systems listed, but... half of them didn't seem to exist any more, and the other half looked very clunky, had strange syntax, and only handled a few specific constructs.
But yes - I have tried to implement that pattern matching engine a couple of times in Pascal. (Remember Pascal?) Getting it to work for a few test cases is easy. Getting it to *properly* handle associativity and commutativity is really nontrivial. (I mean *really* nontrivial! Or perhaps I am an inferior programmer - one of the two!)
The mathematically sensitive matching/substitution is a hard task even if you have at your disposal a reasonably full unifier. Forget Pascal; take Prolog, which will save several days/weeks of the implementation of basic stuff. Even then, it will be quite tedious to write a package able, say, to reduce rational formulae, to reduce a polynomial modulo an ideal, to implement the basic trig identities, to find a reasonable common form for expressions containing complex exponentials AND trigonometrics, etc.
LOL! I did try to learn Prolog once... I never did understand how it was able to make impossible leaps of deduction like that. And yet, in other, similar cases, it couldn't figure out the result. Of course, now I know that the answer is a magic trick known as unification (although I still don't know much about that topic). Not really sure how a unifier would help you manipulate symbolic expressions... If I get time, I might have another go at implementing the pattern matching algorithm in Haskell. Of course, even if it works, that's still a pretty small problem compared to finding a set of transformation rules that covers all the cases you care about, is reasonably efficient, produces the right answers... etc., etc.
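[Editorial note: for the curious, the "magic trick" is small enough to sketch. Unlike the one-way matching discussed earlier in the thread, unification lets variables appear on *both* sides. This toy version even omits the occurs check, so it is a sketch only, with made-up names:]

```haskell
import qualified Data.Map as M

data Term = Var String | Fn String [Term] deriving (Eq, Show)
type Subst = M.Map String Term

-- Chase a variable through the substitution to its current value.
walk :: Subst -> Term -> Term
walk s (Var v) = maybe (Var v) (walk s) (M.lookup v s)
walk _ t       = t

-- Toy syntactic unification: succeed with a substitution making both
-- terms equal, or fail. No occurs check, so cyclic cases misbehave.
unify :: Term -> Term -> Subst -> Maybe Subst
unify a b s = go (walk s a) (walk s b)
  where
    go (Var v) t | Var v == t = Just s
                 | otherwise  = Just (M.insert v t s)
    go t (Var v) = Just (M.insert v t s)
    go (Fn f as) (Fn g bs)
      | f == g && length as == length bs =
          foldl (\ms (x, y) -> ms >>= unify x y) (Just s) (zip as bs)
    go _ _ = Nothing
```

Unifying f(x, a) with f(b, y) binds x to b and y to a, where matching would only have allowed variables on one side. As Jerzy says, though, this is still purely *syntactic*; it knows nothing about associativity, commutativity, or algebra.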

Andrew Coppin responds to my rhetorical question :
Really, haven't heard about Maple??? http://www.maplesoft.com/
Last I heard, Maple is simply another fast number-chrunking engine.
Heavens! Now, as a professional teacher, I should not get nervous too fast, but, sorry to say, you become annoying with your *complete* lack of understanding of the situation in the domain which interests you. Stop "chrunking", please. Maple, Axiom, Derive, etc. are *symbolic/algebraic* packages, period. Number-crunching is NOT their main realm of activity, although some people use them for numerics, in order not to be obliged to shift from one soft to another.
Now Matlab I have heard of. (And I hate it with a passion.)
Hatred, as we all know, is something which will permit you to make many friends and a lot of money. Good luck. But perhaps learn it first, since I am ready to bet a good deal that you don't know why you hate it. The matrix/vector-style, very compact notation is something which is quite popular, and perhaps more than a bit similar to the functional map/zip style. About other, free software:
Now this actually looks vaguely promising. (Rather confused at this point as to what the relationship between Axiom, Maxima, Reduce and Sage is...)
What is the relationship between various brands of cars, motorcycles, etc.?? Why do people agree to buy all of them and not "just the best"? As Jacques Carette said here, and others confirmed, for any reasonably mature product, say, Maple / Mathematica, etc., you will find that in some respects it suits you better than some others, and sometimes not, and either you choose the best/least-bad, or you combine.

Axiom (and MuPAD, and also the semi-commercial, actively developed Australian package Magma!) are designed around a TYPED approach to mathematical structures - something usually found quite sexy by a typical Haskellian. Macsyma/Maxima, based originally on Lisp, was much more dynamic and amorphous, easier to use by newbies. Maple is /par excellence/ procedural. Mathematica insists on rewriting.

Sage... you mean: http://sage.scipy.org/sage/ David Joyner, William Stein et al..., since Sage is the name of at least a dozen different software products... well, Sage is quite dynamic, but first of all, as compared to the Methuselah-like Macsyma/Maxima, it is a baby! A baby with *all* its potentialities, and all its weaknesses. Scriptable in Python, so accessible on many platforms, and able to use the graphical support thereof, with the possibility to plug in - as the authors say - the GAP/Pari/GMP algorithms in order to accelerate the low-level processing, etc. - well - *PERHAPS YOU* will tell us what you can do with it that Haskell doesn't give you. (Unfortunately, I couldn't test it thoroughly, since I have only 7 arms and 37 hours per day, but the interactive tutorial is nice). About other systems:
I did indeed check that list a while back. I looked at a whole heap of systems listed, but... half of them didn't seem to exist any more, and the other half looked very clunky, had strange syntax, and only handled a few specific constructs.
The situation evolves, the Internet sites are, true, too often obsolete. Instead of complaining, I test. From time to time you realize that most of the brands of washing machines on the market known to you as a child aren't there. Will you annoy us with the discovery of this highly non-trivial issue? And, concerning your "strange syntax", I don't buy it at all. ===
LOL! I did try to learn Prolog once... I never did understand how it was able to make impossible leaps of deduction like that. And yet, in other, similar cases, it couldn't figure out the result.
Of course, now I know that the answer is a magic trick known as unification (although I still don't know much about that topic). Not really sure how a unifier would help you manipulate symbolic expressions...
Magic trick?? But this is something basic. ALL comp. sci. students are obliged to learn it, and to learn also the clause-reduction strategy known as resolution. It is intuitive and not complicated at all. OK, from my personal perspective: I tried some time ago to offer high-school pupils a crash course in Prolog, showing the non-deterministic style of programming, and showing an example of a simple-minded simplifier and symbolic differentiator. The unification was presented as a powerful pattern-matcher, able to instantiate logic variables, and to test the coherence within patterns sharing the same elements (which is not possible in Haskell, and *much* less efficient than the Haskell pattern-matcher, for formal reasons). It *WORKED*.

So, either you are 85 or more years old, or you simply never tried to use efficiently your very, very (physically) young brain. A good deal of Prolog non-determinism can be efficiently and nicely simulated in Haskell using the list monad. Our third-year CompSci students are obliged to learn it. They survive it, so can you. This is my last public answer along these lines. Have fun.

Jerzy Karczmarczuk
Caen, France.
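As a hedged illustration of that last point, here is Prolog-style nondeterminism simulated with the list monad. Each bind tries every alternative in turn, and an empty list plays the role of a failed backtracking branch (the function name is my own, not from any course material):

```haskell
-- Enumerate all Pythagorean triples with components up to n, in the
-- style of a Prolog generate-and-test predicate. Each `<-` draws from
-- a list of candidates; returning [] prunes that branch.
pythagorean :: Int -> [(Int, Int, Int)]
pythagorean n = do
  a <- [1 .. n]
  b <- [a .. n]          -- a <= b avoids symmetric duplicates
  c <- [b .. n]
  if a * a + b * b == c * c then return (a, b, c) else []
```

For example, `pythagorean 15` yields `[(3,4,5),(5,12,13),(6,8,10),(9,12,15)]`, much as a Prolog query would enumerate its solutions on backtracking.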

On 6/2/07, jerzy.karczmarczuk@info.unicaen.fr < jerzy.karczmarczuk@info.unicaen.fr> wrote: (... and showing an example of a simple-minded simplifier
and symbolic differentiator. The unification was presented as a powerful pattern-matcher, able to instantiate logic variables, and to test the coherence within patterns sharing the same elements (which is not possible in Haskell, and *much* less efficient than the Haskell pattern-matcher, for formal reasons). It *WORKED*. ...) (... A good deal of Prolog non-determinism can be efficiently and nicely simulated in Haskell using the list monad.
You have brought up Prolog, unification, etc., and knowing this is the Haskell board, I'm just wondering what anyone's thoughts are on the hybrid Haskell-based language Curry for these kinds of problems. It seems that its development is stalled... and sorry ahead of time if I am wrong on that point. It just seems that if it were firing on all cylinders and had implemented all of the "type" magic of the Haskell implementations... wouldn't it be great for doing symbolic manipulations? Any comments on that? Or has anyone out there ever taken a look at Curry in any comprehensive way? It just seems really interesting, and I have fired it up just enough to see that it works, but I was only doing things with it to test, and they were ALL things that I brought over from Haskell... Sorry if this is sorta off topic, but...

gene

You have brought up Prolog, unification, etc., and knowing this is the Haskell board, I'm just wondering what anyone's thoughts are on the hybrid Haskell-based language Curry for these kinds of problems. It seems that its development is stalled... and sorry ahead of time if I am wrong on that point.
In a previous life, I was a logic programming zealot, and looked at Curry. As far as I was concerned, it shared a significant problem with most logic programming work: the designers had a bit of a slack attitude to semantics. The key problem was that it had non-deterministic functions, so you could write (Haskell syntax):

main = do
  if f 42 /= f 42
    then putStr "Look ma, no referential transparency\n"
    else return ()

and expect the putStr could get executed. For reference, it's not a problem in Prolog (if you overlook IO being done with side effects ;-)) because the variables are explicit, and not just a notational convenience as they are in lambda calculus:

main :- f(42,X), f(42,Y), ....

cheers,
T.
--
Dr Thomas Conway
drtomc@gmail.com

Silence is the perfectest herald of joy:
I were but little happy, if I could say how much.

Lennart Augustsson wrote:
Why do you seem so in awe of Mathematica?
Oh, well, I guess it is only the most powerful maths software ever written... no biggie.
It's just another language with a good set of libraries. Claims that it is the best, fastest, etc comes from Wolfram advertising, no doubt. :)
The claim that it is the fastest clearly doesn't hold (much to my surprise). The claim that it is the most powerful, well... I have yet to see anything that can come close to the symbolic power of Mathematica. Let's face it, the thing costs nearly ten grand for a reason...

Andrew Coppin wrote:
> Lennart Augustsson wrote:
>> Why do you seem so in awe of Mathematica?
>
> Oh, well, I guess it is only the most powerful maths software ever
> written... no biggie.

No, it is one of several. In very little time I can find 20 things that Maple does better than Mathematica. In the same amount of time, I can find 20 things that Mathematica does better than Maple. [Actually, the most obvious is that its marketing is miles better; so good that it makes blind evangelists out of people who have not even tried the competitors].

>> It's just another language with a good set of libraries. Claims that
>> it is the best, fastest, etc comes from Wolfram advertising, no
>> doubt. :)
>
> The claim that it is the fastest clearly doesn't hold (much to my
> surprise). The claim that it is the most powerful, well... I have yet
> to see anything that can come close to the symbolic power of Mathematica.

Give Maple a try. For example, you'll find that:

1) Maple's DE solver beats Mathematica hands-down
2) Mathematica's definite integrator beats Maple's hands-down
3) Maple's symbolic non-linear equation solver is best
4) Mathematica's definite summation (ie finding closed forms) is best

and on and on. [I don't know enough about the other systems to make similar comparison lists]. You got suckered by their marketing. Get your head out of the sand, and take a good look around what is available.

Jacques

Jacques Carette wrote:
> Andrew Coppin wrote:
>> Lennart Augustsson wrote:
>>> Why do you seem so in awe of Mathematica?
>>
>> Oh, well, I guess it is only the most powerful maths software ever
>> written... no biggie.
>
> No, it is one of several. In very little time I can find 20 things
> that Maple does better than Mathematica. In the same amount of time, I
> can find 20 things that Mathematica does better than Maple. [Actually,
> the most obvious is that its marketing is miles better; so good that
> it makes blind evangelists out of people who have not even tried the
> competitors].
If Wolfram want to claim that Mathematica has "redefined the face of computer science"... well I don't believe that for one second. If they want to claim it's a product that can do some amazing stuff... well I don't see much evidence to the contrary.
You got suckered by their marketing. Get your head out of the sand, and take a good look around what is available.
I looked, I didn't find anything interesting.

On Wednesday 30 May 2007 22:15:55 Andrew Coppin wrote:
Jon Harrop wrote:
I wrote a toy Mathematica implementation in OCaml while I waited to be viva'd for my PhD. It garnered so much interest that Wolfram Research bought it from me for £4,500 and gave me several free copies of Mathematica.
Are you serious?! o_O
Yes.
1. Numerical libraries: you should be able to reuse existing libraries like GMP, BLAS, LAPACK, FFTW and so on. These are often much faster than Mathematica's. For example, FFTW was about 4x faster than Mathematica's FFT the last time I benchmarked it. However, they do not support interval arithmetic.
Now this is interesting. The claim is that Mathematica is the fastest, most powerful software on planet earth, second to none.
If you write a simple, numerically-intensive program that runs in the Mathematica rewriter then its performance is about 100-1,000x slower than that of a native-code compiled language like Haskell. Mathematica is often 30x slower than interpreted OCaml bytecode. Take this ray tracer, for example:

scene = {Sphere[{0., 0., 4.}, 1.], Sphere[{-1., 1., 4.}, 1.],
         Sphere[{-1., -1., 4.}, 1.], Sphere[{1., 1., 4.}, 1.],
         Sphere[{1., -1., 4.}, 1.]};

\[Delta] = Sqrt[$MachineEpsilon];

Unitise[p_] := p/Sqrt[p.p]

RaySphere[o_, d_, c_, r_] :=
  Block[{v = c - o, b = v.d, disc = b^2 - v.v + r^2},
    If[disc <= 0., \[Infinity],
      disc = Sqrt[disc];
      Block[{t2 = b + disc},
        If[t2 <= 0., \[Infinity],
          Block[{t1 = b - disc}, If[t1 > 0., t1, t2]]]]]]

Intersect[o_, d_][{lambda_, n_}, Sphere[c_, r_]] :=
  Block[{lambda2 = RaySphere[o, d, c, r]},
    If[lambda2 >= lambda, {lambda, n},
      {lambda2, Unitise[o + lambda2 d - c]}]]

Intersect[o_, d_][hit_, list_List] := Fold[Intersect[o, d], hit, list]

nohit = {\[Infinity], {0., 0., 0.}};

RayTrace[o_, d_, scene_] :=
  Block[{lambda, n, g, p},
    {lambda, n} = Intersect[o, d][nohit, scene];
    If[lambda === \[Infinity], 0.,
      g = n.neglight;
      If[g <= 0., 0.,
        p = o + lambda d + \[Delta] n;
        {lambda, n} = Intersect[p, neglight][nohit, scene];
        If[lambda < \[Infinity], 0., g]]]]

Timing[image =
  Table[Table[RayTrace[{0., 0., -2.5}, Unitise[{x, y, 128.}], scene],
    {y, -64, 64}], {x, -64, 64}];]

This program takes 4.8s to run here. I bet if we translate it into Haskell it will run much faster than that. As a guide, this Haskell ray tracer is much more advanced and it can render a bigger (100x100) image in only 0.2s:

http://www.nobugs.org/developer/htrace/

Incidentally, when I try to recompile with optimizations turned on, GHC refuses to work:

$ ghc htrace.hs -o htrace
$ ghc -O2 htrace.hs -o htrace
compilation IS NOT required

I must delete the target or edit the source to get it to recompile. I assume this is a known bug?
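As a rough sense of what such a translation looks like, here is the RaySphere function from the Mathematica listing above rendered in Haskell. This is a sketch only: the vector representation (a bare 3-tuple) and all names (Vec, dot, raySphere) are my own assumptions, not from any existing port:

```haskell
type Vec = (Double, Double, Double)

dot :: Vec -> Vec -> Double
dot (a, b, c) (x, y, z) = a * x + b * y + c * z

sub :: Vec -> Vec -> Vec
sub (a, b, c) (x, y, z) = (a - x, b - y, c - z)

infinity :: Double
infinity = 1 / 0

-- Distance along the ray (origin o, unit direction d) to a sphere with
-- centre c and radius r, or +Infinity on a miss, mirroring the
-- If-cascade in the Mathematica version.
raySphere :: Vec -> Vec -> Vec -> Double -> Double
raySphere o d c r =
  let v    = c `sub` o
      b    = v `dot` d
      disc = b * b - v `dot` v + r * r
  in if disc <= 0 then infinity
     else let disc' = sqrt disc
              t2    = b + disc'
          in if t2 <= 0 then infinity
             else let t1 = b - disc'
                  in if t1 > 0 then t1 else t2
```

For example, a ray from the origin along the z axis hits the unit sphere at {0,0,4} at distance 3: `raySphere (0,0,0) (0,0,1) (0,0,4) 1` gives `3.0`.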
Actually it turns out that at least for factoring moderately big integers, pari/gp seems to be a fair bit faster (50% or so). I have no idea about the rest.
Right, but that is just calling an internal function that is written in C. Provided you only ever call a few library functions, performance will be excellent in Mathematica. But when you cannot phrase your program in terms of the built-in library functions, performance is terrible and this is when everyone reaches for a more efficient tool. To me, performance is way down on the list of things that make Mathematica great. Integrated graphics is probably the main benefit. I mostly use MMA as a glorified graph plotter.
Note that (as I understand it) GHC implements Haskell's Integer type using the GMP. And for some reason or other, they want to remove this feature...
Arbitrary precision integers are quite a performance burden and they are rarely used. I would not expect a language that is trying to be efficient to impose arbitrary precision integers (or floats).
Erm... have you seen Mathematica 6?
Yes.
That's OpenGL accelerated too.
Yes. Similarly, making graphics fast takes a lot more than just "using OpenGL".
I've just been playing with it in fact - it's pretty fast as far as I can tell=
It still invokes a completely generic term rewriter for everything it evaluates. You can really feel this when you play with some of their interactive demos. Even the simple ones are notably sluggish. Try translating the Tiger demo from our site into Mathematica, for example. Many of their demos will be trivial to write in Haskell but performance would be a lot better. I'd like to write some graphics demos in Haskell using OpenGL...

--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
OCaml for Scientists
http://www.ffconsultancy.com/products/ocaml_for_scientists/?e

On Wed, May 30, 2007 at 11:56:30PM +0100, Jon Harrop wrote:
On Wednesday 30 May 2007 22:15:55 Andrew Coppin wrote:
Jon Harrop wrote:
I wrote a toy Mathematica implementation in OCaml while I waited to be viva'd for my PhD. It garnered so much interest that Wolfram Research bought it from me for £4,500 and gave me several free copies of Mathematica.
Are you serious?! o_O
Yes.
You said that constructing a specification is the hardest part of implementing Mathematica, and you also say you managed to clone it. Can you reveal your specification, or did WR give you an NDA? Stefan

On Thursday 31 May 2007 00:10:27 Stefan O'Rear wrote:
You said that constructing a specification is the hardest part of implementing Mathematica, and you also say you managed to clone it. Can you reveal your specification, or did WR give you an NDA?
NDA, although I did most of the reverse engineering independently beforehand. They use lots of nifty tricks (basically any trick that you can do easily in C) but there were plenty of other tricks they didn't try because they are far from obvious in C. I found an interesting alternative was to keep a lazily evaluated set of the symbols found in each subexpression. This could be used to cull searches and avoid unnecessary rewriting but also had no significant performance overhead.

--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
OCaml for Scientists
http://www.ffconsultancy.com/products/ocaml_for_scientists/?e
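The lazily evaluated symbol sets described above might be sketched like this in Haskell, where laziness makes the "pay only when queried" behaviour automatic. All names here (Term, symbols, mkApp, mightMatch) are illustrative assumptions:

```haskell
import qualified Data.Set as S

-- Each application node caches the set of symbols occurring anywhere
-- beneath it, so a rewriter can skip subtrees that cannot match a rule.
data Term = TSym String
          | TNum Integer
          | TApp Term [Term] (S.Set String)  -- cached symbol set

symbols :: Term -> S.Set String
symbols (TSym s)      = S.singleton s
symbols (TNum _)      = S.empty
symbols (TApp _ _ ss) = ss  -- computed at build time, forced lazily

-- Smart constructor: the set is a lazy thunk, so it costs nothing
-- until a rewrite rule actually asks whether a symbol occurs.
mkApp :: Term -> [Term] -> Term
mkApp f xs = TApp f xs (S.unions (symbols f : map symbols xs))

-- A rule that mentions symbol s can prune any subterm lacking s.
mightMatch :: String -> Term -> Bool
mightMatch s t = s `S.member` symbols t
```

So a rule rewriting occurrences of "x" would descend into `mkApp (TSym "Plus") [TSym "x", TNum 1]` but could skip a sibling subtree whose cached set does not contain "x".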

On 5/30/07, Jon Harrop
Incidentally, when I try to recompile with optimizations turned on, GHC refuses to work:
$ ghc htrace.hs -o htrace
$ ghc -O2 htrace.hs -o htrace
compilation IS NOT required
I must delete the target or edit the source to get it to recompile. I assume this is a known bug?
If the sources haven't changed and you're only using a different combination of command-line options, GHC's recompilation checker will determine that no recompilation is necessary. You can turn off the recompilation checker and force recompilation unconditionally by adding the -no-recomp flag. (There's already a feature request to make the recompilation checker consider changes to command-line options as well as code, it just hasn't been implemented.)

Cheers,
Tim

--
Tim Chevalier * chevalier@alum.wellesley.edu * Often in error, never in doubt

| Incidentally, when I try to recompile with optimizations turned on, GHC
| refuses to work:
|
| $ ghc htrace.hs -o htrace
| $ ghc -O2 htrace.hs -o htrace
| compilation IS NOT required

Yes, I think it's a bug. GHC should really compare the flags used last time with the flags used this time, and recompile if they have changed. If enough people yell about it, we'll probably fix it! Meanwhile opening a Trac bug report (or perhaps feature request) would be a good start.

Simon

http://hackage.haskell.org/trac/ghc/ticket/106
It got changed to Won't Fix. Consider this a yell!
On 31/05/07, Simon Peyton-Jones
| Incidentally, when I try to recompile with optimizations turned on, GHC
| refuses to work:
|
| $ ghc htrace.hs -o htrace
| $ ghc -O2 htrace.hs -o htrace
| compilation IS NOT required
Yes, I think it's a bug. GHC should really compare the flags used last time with the flags used this time, and recompile if they have changed. If enough people yell about it, we'll probably fix it! Meanwhile opening a Trac bug report (or perhaps feature request) would be a good start.
Simon
_______________________________________________ Haskell-Cafe mailing list Haskell-Cafe@haskell.org http://www.haskell.org/mailman/listinfo/haskell-cafe

On Thu, 2007-05-31 at 08:46 +0100, Simon Peyton-Jones wrote:
| $ ghc htrace.hs -o htrace
| $ ghc -O2 htrace.hs -o htrace
| compilation IS NOT required
Yes, I think it's a bug. GHC should really compare the flags used last time with the flags used this time [...]
As an (easier) alternative, I would find it in line with my expectations if GHC always recompiled in the absence of --make, and recompiled based on time stamps in the presence of --make. -k

Jon Harrop wrote:
On Wednesday 30 May 2007 22:15:55 Andrew Coppin wrote:
Note that (as I understand it) GHC implements Haskell's Integer type using the GMP. And for some reason or other, they want to remove this feature...
Arbitrary precision integers are quite a performance burden and they are rarely used. I would not expect a language that is trying to be efficient to impose arbitrary precision integers (or floats).
In Haskell, Int gives you the standard signed, fixed size integer for your machine, and Integer gives arbitrary precision integers. Int8, Int16, ... provide signed ints of a known size, and Word8, Word16 give the unsigned. They are all instances of Num, so integer literals will be whatever type is needed.
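A small illustration of the types listed above: Integer is exact at any size, while the fixed-size types wrap around on overflow (two's-complement behaviour in GHC):

```haskell
import Data.Int (Int8)

-- Integer is arbitrary precision: this is exact, not a float.
big :: Integer
big = 2 ^ 100          -- 1267650600228229401496703205376

-- Int8 is a signed 8-bit integer: 127 + 1 wraps around to -128.
wrapped :: Int8
wrapped = 127 + 1
```

Because all of these types are instances of Num, the literals 127 and 1 above are read at type Int8 with no conversion functions needed, exactly as described.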

Jon Harrop wrote:
Arbitrary precision integers are quite a performance burden and they are rarely used. I would not expect a language that is trying to be efficient to impose arbitrary precision integers (or floats).
Apparently you have never looked inside a computer *algebra* system, one that does *exact* computations (i.e. on polynomials with exact coefficients, not floats/doubles). Arbitrary precision integers are not a 'burden', they are an absolute necessity. Algorithms on polynomials will almost inevitably produce 'coefficient growth'. Even things as simple as sub-resultant computations (for computing extended GCDs) have this problem. And this is not a fluke or a problem with the state of the art; there are known cases where this is inevitable. Like Jerzy, I wonder why I get suckered in to these conversations. I guess we both have this silly need to set the record straight. Jacques

Jon Harrop wrote:
If you write a simple, numerically-intensive program that runs in the Mathematica rewriter then its performance is about 100-1,000x slower than that of a native-code compiled language like Haskell. Mathematica is often 30x slower than interpreted OCaml bytecode.
Is this before or after compiling the Mathematica code?
Incidentally, when I try to recompile with optimizations turned on, GHC refuses to work:
$ ghc htrace.hs -o htrace
$ ghc -O2 htrace.hs -o htrace
compilation IS NOT required
I must delete the target or edit the source to get it to recompile. I assume this is a known bug?
This one surprised me... I'm pretty sure I tried recompiling with -O2 and it recompiled everything. Maybe I imagined it?
Right, but that is just calling an internal function that is written in C. Provided you only ever call a few library functions, performance will be excellent in Mathematica. But when you cannot phrase your program in terms of the built-in library functions, performance is terrible and this is when everyone reaches for a more efficient tool.
I don't know, I thought one of the big advantages of Mathematica was supposed to be that it can transform problems into the most efficiently solvable form, select between multiple algorithms, etc.
To me, performance is way down on the list of things that make Mathematica great. Integrated graphics is probably the main benefit. I mostly use MMA as a glorified graph plotter.
And I mostly use it to do insane things like give me parametric solutions to giant polynomials and so forth... ;-)

On Thursday 31 May 2007 20:56:47 Andrew Coppin wrote:
Jon Harrop wrote:
If you write a simple, numerically-intensive program that runs in the Mathematica rewriter then its performance is about 100-1,000x slower than that of a native-code compiled language like Haskell. Mathematica is often 30x slower than interpreted OCaml bytecode.
Is this before or after compiling the Mathematica code?
The 100-1,000x slower is without compilation in Mathematica. I found the ray tracer to be 30x slower in compiled Mathematica compared to OCaml bytecode (OCaml native code is much faster still). So GHC will easily beat Mathematica on such tasks. Incidentally, once you've reimplemented the core Mathematica language in 4 days, you might like to reimplement their compiler. I believe their spec is freely available but you could just redo the whole thing yourself. You could probably extend it and optimise it quite easily. For example, it doesn't support recursion and I don't think it does type inference.
I don't know, I thought one of the big advantages of Mathematica was supposed to be that it can transform problems into the most efficiently solvable form, select between multiple algorithms, etc.
On certain very specific problems, yes. So it might check to see if a matrix is symmetric and use a faster routine when possible. As far as the language itself is concerned, it is fairly mundane.
And I mostly use it to do insane things like give me parametric solutions to giant polynomials and so forth... ;-)
I used it to take integrals, do lots of FFTs, some general data analysis and lots of graph plotting. There are plenty of other computer algebra packages that could handle the integrals (although I never found any were better than just doing it by hand) and general-purpose languages that are better suited to the analysis, so now I just use it as a graph plotter.

--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
OCaml for Scientists
http://www.ffconsultancy.com/products/ocaml_for_scientists/?e

On Wednesday 30 May 2007 07:04:31 Jon Harrop wrote:
3. The language: the hardest part of reimplementing Mathematica is inferring what it means (there are no formal evaluation semantics). Once you've done that it is just a case of implementing an extensible term rewriter and putting in about 20 core rules. The pattern matcher is well under 100 LOC and you can do various things to make it more efficient. There are two tricks that vastly improve performance of the rewriter: return physically identical results whenever possible, and perform substitution and evaluation at the same time rather than as two separate passes.
Sorry for replying to myself. :-) It occurs to me that laziness will eliminate the intermediate data structure between substitution and evaluation anyway, so that isn't such a concern in Haskell. However, I can't think how you might return physically identical results when possible in Haskell. Essentially, you need a higher-order map function:

val id_map : ('a -> 'a) -> 'a t -> 'a t

that returns its input when "f x = x" for every x. How might this be done?

--
Dr Jon D Harrop, Flying Frog Consultancy Ltd.
OCaml for Scientists
http://www.ffconsultancy.com/products/ocaml_for_scientists/?e

Jon,
However, I can't think how you might return physically identical results when possible in Haskell. Essentially, you need a higher-order map function:
val id_map : ('a -> 'a) -> 'a t -> 'a t
that returns its input when "f x = x" for every x. How might this be done?
fmap :: (Functor f) => (a -> b) -> f a -> f b

Cheers,

Stefan

Jon Harrop wrote:
However, I can't think how you might return physically identical results when possible in Haskell.
Perhaps you might be interested then in the following function that non-destructively updates a subterm in a large term, preserving sharing. The function can be used to do a substitution in a term. The function is described in http://okmij.org/ftp/Haskell/Zipper2.lhs beginning with the phrase ``We start therefore with an improved term enumerator that maximally preserves sharing. If no updates were done, the result of the traversal is the original term itself -- rather than its copy. Furthermore, this property holds for each subterm. The new traversal function lets us operate on subterms in pre-order, in-order, or post-order. More importantly, it lets us effectively span a `canonical' spanning tree on a term, so each node can be unambiguously identified. We do not require equality on (sub)terms.'' That was the second article in a series; please see http://okmij.org/ftp/Computation/Continuations.html#zipper for the full series.
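A simpler (and much less general) answer to Jon's question than the zipper-based traversal above: instead of comparing pointers, have the traversal report "no change" explicitly with Maybe, and rebuild a node only when some child actually changed; the caller reuses the physically identical original otherwise. This is only an illustrative sketch on a toy tree type; the names (Tree, idMapM, idMap) are my own assumptions:

```haskell
data Tree a = Leaf a | Node (Tree a) (Tree a)
  deriving (Eq, Show)

-- Return Nothing when f changed no element, so callers can keep the
-- physically identical original subtree instead of an equal copy.
idMapM :: Eq a => (a -> a) -> Tree a -> Maybe (Tree a)
idMapM f (Leaf x) =
  let y = f x in if y == x then Nothing else Just (Leaf y)
idMapM f (Node l r) =
  case (idMapM f l, idMapM f r) of
    (Nothing, Nothing) -> Nothing   -- whole subtree untouched
    (ml, mr)           -> Just (Node (maybe l id ml) (maybe r id mr))

-- The id_map from the question: returns its argument itself (sharing
-- preserved) whenever f is the identity on every element.
idMap :: Eq a => (a -> a) -> Tree a -> Tree a
idMap f t = maybe t id (idMapM f t)
```

Note the caveat: this uses an Eq constraint to detect "f x = x", whereas the OCaml trick uses physical equality for free; Oleg's zipper article avoids requiring equality on subterms altogether.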
participants (17)
- Al Falloon
- Alex Queiroz
- Andrew Coppin
- Gene A
- Jacques Carette
- jerzy.karczmarczuk@info.unicaen.fr
- Jon Harrop
- Ketil Malde
- Lennart Augustsson
- oleg@pobox.com
- Rodrigo Queiro
- Simon Peyton-Jones
- Stefan Holdermans
- Stefan O'Rear
- Thomas Conway
- Tim Chevalier
- Tim Newsham