
Manlio Perillo wrote:
> But this may be really a question of personal taste or experience.
> What is more "natural"?
> 1) pattern matching  2) recursion
> or
> 1) function composition  2) high level functions
Which is more "natural":

* C-style for-loops (aka assembly while-loops), or
* any modern language's foreach loops (aka iterators)?

Following directly from the Rule of Least Power, if you can get away with foreach then that's what you should use (a small Haskell sketch of this appears further down). Why? Because the less power the construct has, the fewer corner cases and generalizations a reader of the code needs to consider. Now, just because iterators exist does not mean that one should never use the more general tool. If you're fighting to break out of your chosen straitjacket, then chances are it's the wrong one to use in the first place; it'd be clearer to use more power and have less fighting. Both of these conclusions seem quite natural to me, even from before learning Haskell. It seems, therefore, that "naturality" is not the proper metric to discuss.

It's oft overlooked, but the fact is that expressivity comes not from more formal power, but from _less_.

* A human's (or any vertebrate's) range of motion is severely crippled when compared to that of an amoeba; and yet it is those limitations which provide the structure necessary to perform greater tasks such as grasping, lifting, jumping, etc.

* Natural language has a limited range of words and syntactic constructs, but gives large-enough building blocks to enable unconstrained communication; whereas a language with a unique word for every utterance (arguably simpler) would be impossible to learn.

* Regular expressions (and other classes of automata) have severe limitations on formal power, and yet these constraints enable poly-time algorithms for intersection, union, etc.

* Haskell's type system (sans extensions) is not Turing complete, yet this enables us to infer types rather than requiring annotations or proofs.

The contemporary state of scientific research is focused heavily on reductionism (the idea of being able to reduce all biology to chemistry, all chemistry to physics, all computer science to mathematics, etc.). But as any systems theorist will tell you, this approach is misguided if the goal is a Theory of Everything. As per the famous book: no matter how much you learn about quarks, that tells you nothing about jaguars. At every step of reduction there is an increase in formal power and a concomitant loss of information. Even perfect knowledge of quarks and perfect simulation software isn't enough, because you've lost the _abstraction_ that is "jaguar". You can simulate it, emulate it, model it, but you've lost the high-level perspective that says jaguars are different and more interesting than an arbitrary simulation of a collection of quarks. (And it's doubtful we'll ever have the omniscience to get even that far.)

While primitive recursion and case matching are _fundamental_ (that is, at the bottom of a reductionist tower), that does not entail that they are _central_ (that is, a ubiquitous pattern at every resolution of reduction). Church encodings, SKI combinators, the Curry-Howard isomorphism, and the like are also fundamental topics to teach and understand; but they're rarely ones that should be central to a program or library.

Now, many Haskellers (like good scientists) bristle at this fundamental nature of things. And in response we're constantly coming up with new generalizations which have little-enough structure to be descriptive while having big-enough structure to be interesting. If there's too much structure, it's boilerplate and therefore unusable; if there's too little, it has no generality and is therefore unhelpful.
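
To make the earlier foreach point concrete, here is a minimal Haskell sketch (the function names are just illustrative): both definitions do the same thing, but the first uses explicit pattern matching and general recursion, while the second reaches for the less-powerful, foreach-like tool.

    -- General recursion: the reader must check the base case, the cons
    -- case, and that the recursion actually terminates.
    doubleAllRec :: [Int] -> [Int]
    doubleAllRec []     = []
    doubleAllRec (x:xs) = 2 * x : doubleAllRec xs

    -- The less-powerful tool: map can only ever walk the list element
    -- by element, so there are no corner cases left to audit.
    doubleAll :: [Int] -> [Int]
    doubleAll = map (2 *)

A reader of the second version has strictly less to verify, which is the Rule of Least Power at work.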
But somewhere between those two extremes of structure, someone has to make a judgment call and decide whether some particular pattern measures up to the metric of being helpful and usable. If it does, then everyone (whose domain it covers) should learn it and use it, because it simplifies programming from a high level of design. Giants. Shoulders. Etc.

--
Live well,
~wren