Re: Haskell-Cafe Digest, Vol 53, Issue 2

The normal view taken by Haskellers is that the denotations of
Haskell types are CPPOs.
CPPO?
So:
(1) Must be monotone (2) Must be continuous
Could you please define what you mean by those terms in this context?
(Needn't be strict, even though that messes up the resulting category substantially).
I'm not convinced that the category is all that "messed up".
The extra P would stand for "pointed" (has a least element, bottom); this is common in some communities. To me, though, a cpo (complete partial order) is closed under directed suprema, and the empty set is directed, so bottom is already required. The category of cpos is not cartesian closed. For denotational semantics, I believe the subcategory of Scott domains is what is usually considered.
Continuous functions on cpos are by definition monotone and they respect directed suprema. Jens

On 3 Jan 2008, at 3:40 AM, Jens Blanck wrote:
The normal view taken by Haskellers is that the denotations of Haskell types are CPPOs.
CPPO?
So:
(1) Must be monotone (2) Must be continuous
Could you please define what you mean by those terms in this context?
(Needn't be strict, even though that messes up the resulting category substantially).
I'm not convinced that the category is all that "messed up".
Well, no coproducts (Haskell uses a lifted version of the coproduct from CPO). Of course, Haskell makes things even worse by lifting the product and exponential objects as well, which, come to think of it, is unnecessary even in the category of CPPOs and not-necessarily-strict continuous functions.

The extra P would stand for "pointed" (has a least element, bottom), this is common in some communities. To me though, a cpo (complete partial order) is closed under directed suprema and the empty set is directed so bottom is already required.

Not so. A set is directed iff every finite subset has an upper bound in the set; {} is finite, so it must have an upper bound in the set. So directed sets must be non-empty. (So CPOs needn't be pointed.) jcc

Jonathan Cast wrote:
The normal view taken by Haskellers is that the denotations of Haskell types are CPPOs. So: (1) Must be monotone (2) Must be continuous (Needn't be strict, even though that messes up the resulting category substantially).
I wrote:
I'm not convinced that the category is all that "messed up".
Well, no coproducts (Haskell uses a lifted version of the coproduct from CPO).
What goes wrong with finite coproducts? The obvious thing to do would be to take the disjoint union of the sets representing the types, identifying the copies of _|_. What is the lifted version you are referring to?
Of course, Haskell makes things even worse by lifting the product and exponential objects,
OK, what goes wrong there, and what is the lifting? Thanks, Yitz

On 5 Jan 2008, at 6:03 PM, Yitzchak Gale wrote:
Jonathan Cast wrote:
The normal view taken by Haskellers is that the denotations of Haskell types are CPPOs. So: (1) Must be monotone (2) Must be continuous (Needn't be strict, even though that messes up the resulting category substantially).
I wrote:
I'm not convinced that the category is all that "messed up".
Well, no coproducts (Haskell uses a lifted version of the coproduct from CPO).
What goes wrong with finite coproducts? The obvious thing to do would be to take the disjoint union of the sets representing the types, identifying the copies of _|_.
This isn't a coproduct. If we have f x = 1 and g y = 2, then there should exist a function h such that h . Left = f and h . Right = g, i.e., h (Left x) = f x = 1 and h (Right y) = g y = 2. But by your rule, Left undefined = Right undefined, so 1 = h (Left undefined) = h (Right undefined) = 2, which is a contradiction. Identifying Left _|_ and Right _|_ produces a pointed CPO, but it's not a coproduct. Unless, of course, you require your functions to be strict --- then both f and g above become illegal, and repairing them removes the problem.
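The argument can be checked directly in GHC: with the ordinary (unidentified) Either, the mediating map either f g already distinguishes Left undefined from Right undefined, so any quotient identifying them cannot admit such an h. A minimal sketch (f, g, h are the names from the argument above):

```haskell
module Main where

-- f and g ignore their arguments, as in the example above.
f :: Int -> Int
f _ = 1

g :: Bool -> Int
g _ = 2

-- The mediating morphism the coproduct property demands:
-- h . Left = f and h . Right = g.
h :: Either Int Bool -> Int
h = either f g

main :: IO ()
main = do
  print (h (Left undefined))   -- 1, because f never forces its argument
  print (h (Right undefined))  -- 2, because g never forces its argument
  -- If Left undefined and Right undefined were identified, these two
  -- results would have to be equal -- the contradiction in the text.
```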
What is the lifted version you are referring to?
Take the ordinary disjoint union, and then add a new _|_ element, distinct from both existing copies of _|_ (which are still distinct from each other).
Of course, Haskell makes things even worse by lifting the product and exponential objects,
OK, what goes wrong there, and what is the lifting?
Again, in Haskell, (_|_, _|_) /= _|_. The difference is in the function f (x, y) = 1, which gives f undefined = undefined but f (undefined, undefined) = 1. Unfortunately, this means that (alpha, beta) has an extra _|_ element < (_|_, _|_), so it's not the categorical product (which would lack such an element for the same reason as above). This is partly an implementation issue --- compiling pattern matching without introducing such a lifting requires a parallel implementation --- and partly a semantic issue --- data introduces a single level of lifting, every time it is used, and every constructor is completely lazy. Functions have the same issue --- in the presence of seq, undefined /= const undefined. Extensionality is a key part of the definition of all of these constructions. The categorical rules are designed to require, in concrete categories, that the ranges of the two injections into a coproduct form a partition of the coproduct, the surjective pairing law (fst x, snd x) = x holds, and the eta reduction law (\ x -> f x) = f holds. Haskell flouts all three; while some categories have few enough morphisms to get away with this (at least some of the time), Hask is not one of them. jcc
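The product half of this is equally easy to witness in GHC; a small sketch, using Control.Exception only to observe the bottom without crashing:

```haskell
module Main where

import Control.Exception (SomeException, evaluate, try)

-- Matching on the pair constructor forces the pair itself,
-- but not its components.
f :: (Int, Int) -> Int
f (_, _) = 1

main :: IO ()
main = do
  print (f (undefined, undefined))  -- 1: the pair constructor is there
  r <- try (evaluate (f undefined)) :: IO (Either SomeException Int)
  case r of
    Left _  -> putStrLn "f undefined = _|_"  -- the match forces the pair
    Right n -> print n
```

So _|_ and (_|_, _|_) are observably different values of the pair type, which is exactly the extra element below (_|_, _|_) described above.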

I wrote:
What goes wrong with finite coproducts? The obvious thing to do would be to take the disjoint union of the sets representing the types, identifying the copies of _|_.
Jonathan Cast wrote:
This isn't a coproduct. If we have f x = 1 and g y = 2, then there should exist a function h such that h . Left = f and h . Right = g... But by your rule, Left undefined = Right undefined... Which is a contradiction...
OK, thanks.
Unless, of course, you require your functions to be strict --- then both f and g above become illegal, and repairing them removes the problem.
You don't have to make them illegal - just not part of your notion of "coproduct". That is an entirely category-theoretic concept, since Empty is bi-universal, and a morphism f : A -> B is strict iff the diagram

        f
    A -----> B
     \      ^
      \    /
       v  /
      Empty

commutes. However, the coproduct you get is the one I suggested, namely Either !A !B, not the one we usually use.
What is the lifted version you are referring to?
Take the ordinary disjoint union, and then add a new _|_ element, distinct from both existing copies of _|_ (which are still distinct from each other).
Of course, Haskell makes things even worse by lifting the product and exponential objects,
OK, what goes wrong there, and what is the lifting?
Again, in Haskell, (_|_, _|_) /= _|_. The difference is in the function
f (x, y) = 1
which gives f undefined = undefined but f (undefined, undefined) = 1. Unfortunately, this means that (alpha, beta) has an extra _|_ element < (_|_, _|_), so it's not the categorical product (which would lack such an element for the same reason as above). This is partly an implementation issue --- compiling pattern matching without introducing such a lifting requires a parallel implementation --- and partly a semantic issue --- data introduces a single level of lifting, every time it is used, and every constructor is completely lazy.
Functions have the same issue --- in the presence of seq, undefined /= const undefined.
Extensionality is a key part of the definition of all of these constructions. The categorical rules are designed to require, in concrete categories, that the ranges of the two injections into a coproduct form a partition of the coproduct, the surjective pairing law (fst x, snd x) = x holds, and the eta reduction law (\ x -> f x) = f holds. Haskell flouts all three; while some categories have few enough morphisms to get away with this (at least some of the time), Hask is not one of them.
jcc

(sorry, I hit the send button)
What is the lifted version you are referring to?
Take the ordinary disjoint union, and then add a new _|_ element, distinct from both existing copies of _|_ (which are still distinct from each other).
Now why is that not the category-theoretic coproduct? h . Left = f and h . Right = g both for _|_ and for finite elements of the types. And it looks universal to me.
Of course, Haskell makes things even worse by lifting the product and exponential objects,
OK, what goes wrong there, and what is the lifting?
Again, in Haskell, (_|_, _|_) /= _|_. The difference is in the function f (x, y) = 1 which gives f undefined = undefined but f (undefined, undefined) = 1.
I don't get that one. Given any f and g, we define h x = (f x, g x). Why do we not have fst . h = f and snd . h = g, both in Hask and StrictHask? fst and snd are strict.
Unfortunately, this means that (alpha, beta) has an extra _|_ element < (_|_, _|_), so it's not the categorical product (which would lack such an element for the same reason as above).
The reason you can't adjoin an extra element to (A,B) in, say, Set, is that you would have nowhere to map it under fst and snd. But here that is not a problem, _|_ goes to _|_ under both.
This is partly an implementation issue --- compiling pattern matching without introducing such a lifting requires a parallel implementation
That's interesting. So given a future platform where parallelizing is much cheaper than it is today, we could conceivably have a totally lazy version of Haskell. I wonder what it would be like to program in that environment, what new problems would arise and what new techniques would solve them. Sounds like a nice research topic. Who is working on it?
--- and partly a semantic issue --- data introduces a single level of lifting, every time it is used, and every constructor is completely lazy.
Unless you use bangs. So both options are available, and that essentially is what defines Haskell as being a non-strict language.
Functions have the same issue --- in the presence of seq, undefined /= const undefined.
Right. I am becoming increasingly convinced that the seq issue is a red herring.
Extensionality is a key part of the definition of all of these constructions. The categorical rules are designed to require, in concrete categories, that the ranges of the two injections into a coproduct form a partition of the coproduct, the surjective pairing law (fst x, snd x) = x holds, and the eta reduction law (\ x -> f x) = f holds. Haskell flouts all three; while some categories have few enough morphisms to get away with this (at least some of the time), Hask is not one of them.
That interpretation is not something that is essential in the notion of category, only in certain specific examples of categories that you know. Product and coproduct in any given category - whether they exist, what they are like if they exist, and what alternative constructions exist if they do not - reflect the nature of the structure that the category is modeling. I am interested in understanding the category Hask that represents what Haskell really is like as a programming language. Not under some semantic mapping that includes non-computable functions, and that forgets some of the interesting structure that comes from laziness (though that is undoubtedly also very interesting). -Yitz

Take the ordinary disjoint union, and then add a new _|_ element, distinct from both existing copies of _|_ (which are still distinct from each other).
Now why is that not the category-theoretic coproduct? h . Left = f and h . Right = g both for _|_ and for finite elements of the types. And it looks universal to me.
Yeah, but there could be more functions from Either X Y to Z than pairs of functions from X to Z and from Y to Z. For example, if z :: Z, then you have two functions h1 and h2 such that h1 . Left = const z and h1 . Right = const z, and the same holds for h2. Namely, h1 = const z and h2 = either (const z) (const z). These functions are different: h1 (_|_) = z while h2 (_|_) = _|_. And if Either X Y were a category-theoretic coproduct, then a function from Either X Y to Z would be UNIQUELY determined by its restrictions to X and Y.
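The h1/h2 pair can be written out and distinguished in GHC (Z is specialized to Int here for concreteness):

```haskell
module Main where

import Control.Exception (SomeException, evaluate, try)

z :: Int
z = 0

h1, h2 :: Either Int Bool -> Int
h1 = const z                     -- never inspects its argument
h2 = either (const z) (const z)  -- forces the Left/Right constructor

main :: IO ()
main = do
  print (h1 undefined)  -- z: h1 _|_ = z
  r <- try (evaluate (h2 undefined)) :: IO (Either SomeException Int)
  case r of
    Left _  -> putStrLn "h2 undefined = _|_"
    Right n -> print n
  -- Both h1 and h2 compose with Left and Right to give const z,
  -- yet they differ at _|_, so the mediating morphism is not unique.
```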

On 6 Jan 2008, at 5:32 AM, Yitzchak Gale wrote:
(sorry, I hit the send button)
What is the lifted version you are referring to?
Take the ordinary disjoint union, and then add a new _|_ element, distinct from both existing copies of _|_ (which are still distinct from each other).
Now why is that not the category-theoretic coproduct? h . Left = f and h . Right = g both for _|_ and for finite elements of the types. And it looks universal to me.
Not quite. The only requirement for h _|_ is that it be <= f x for all x and <= g y for all y. If f x = 1 = g y for all x, y, then both h _|_ = _|_ and h _|_ = 1 are arrows of the category. So the universal property still fails.
Of course, Haskell makes things even worse by lifting the product and exponential objects,
OK, what goes wrong there, and what is the lifting?
Again, in Haskell, (_|_, _|_) /= _|_. The difference is in the function f (x, y) = 1 which gives f undefined = undefined but f (undefined, undefined) = 1.
I don't get that one. Given any f and g, we define h x = (f x, g x). Why do we not have fst . h = f and snd . h = g, both in Hask and StrictHask? fst and snd are strict.
Again, if f and g are both strict, we have a choice for h _|_ --- either h _|_ = _|_ or h _|_ = (_|_, _|_) will work (fst . h = f and snd . h = g), but again these are different morphisms.
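Both candidate mediating morphisms can be exhibited concretely; a sketch with f = g = id (which are strict):

```haskell
module Main where

import Control.Exception (SomeException, evaluate, try)

f, g :: Int -> Int
f = id
g = id

h1, h2 :: Int -> (Int, Int)
h1 x = x `seq` (f x, g x)  -- h1 _|_ = _|_
h2 x = (f x, g x)          -- h2 _|_ = (_|_, _|_)

-- For every x, including _|_: fst (h1 x) = f x and snd (h1 x) = g x,
-- and likewise for h2. Yet h1 and h2 differ at _|_:
main :: IO ()
main = do
  print (h2 undefined `seq` "h2 undefined is a pair in WHNF")
  r <- try (evaluate (h1 undefined)) :: IO (Either SomeException (Int, Int))
  case r of
    Left _  -> putStrLn "h1 undefined = _|_"
    Right _ -> putStrLn "h1 undefined is a pair"
```

Two distinct arrows satisfy the product equations, so uniqueness fails and the lifted pair is not the categorical product.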
Unfortunately, this means that (alpha, beta) has an extra _|_ element < (_|_, _|_), so it's not the categorical product (which would lack such an element for the same reason as above).
The reason you can't adjoin an extra element to (A,B) in, say, Set, is that you would have nowhere to map it under fst and snd. But here that is not a problem, _|_ goes to _|_ under both.
This is partly an implementation issue --- compiling pattern matching without introducing such a lifting requires a parallel implementation
That's interesting. So given a future platform where parallelizing is much cheaper than it is today, we could conceivably have a totally lazy version of Haskell. I wonder what it would be like to program in that environment, what new problems would arise and what new techniques would solve them. Sounds like a nice research topic. Who is working on it?
--- and partly a semantic issue --- data introduces a single level of lifting, every time it is used, and every constructor is completely lazy.
Unless you use bangs. So both options are available, and that essentially is what defines Haskell as being a non-strict language.
(!alpha, !beta) isn't a categorical product, either. snd (undefined, 1) = undefined with this type.
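This can be replayed with a strict-field datatype standing in for (!alpha, !beta) --- SP, sfst, and ssnd are made-up names for the sketch:

```haskell
module Main where

import Control.Exception (SomeException, evaluate, try)

-- A pair with strict components: SP _|_ y = _|_ = SP x _|_.
data SP a b = SP !a !b

sfst :: SP a b -> a
sfst (SP a _) = a

ssnd :: SP a b -> b
ssnd (SP _ b) = b

main :: IO ()
main = do
  -- With the lazy pair, snd (undefined, 1) = 1; with the strict one,
  -- building the pair already forces the first component:
  print (snd (undefined, 1 :: Int))
  r <- try (evaluate (ssnd (SP (undefined :: Int) (1 :: Int))))
         :: IO (Either SomeException Int)
  case r of
    Left _  -> putStrLn "ssnd (SP undefined 1) = _|_"
    Right n -> print n
```

So the strict pair fails the product property from the other direction: ssnd no longer commutes with the second projection's expected behavior on partially defined arguments.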
Functions have the same issue --- in the presence of seq, undefined /= const undefined.
Right. I am becoming increasingly convinced that the seq issue is a red herring.
Care to give an explanation?
Extensionality is a key part of the definition of all of these constructions. The categorical rules are designed to require, in concrete categories, that the ranges of the two injections into a coproduct form a partition of the coproduct, the surjective pairing law (fst x, snd x) = x holds, and the eta reduction law (\ x -> f x) = f holds. Haskell flouts all three; while some categories have few enough morphisms to get away with this (at least some of the time), Hask is not one of them.
That interpretation is not something that is essential in the notion of category, only in certain specific examples of categories that you know.
I understand category theory. I also know that the definitions used are chosen to get Set `right', which means extensionality in that case, and are then extended uniformly across all categories. This has the effect of requiring similar constructions to those in Set in other concrete categories.
Product and coproduct in any given category - whether they exist, what they are like if they exist, and what alternative constructions exist if they do not - reflect the nature of the structure that the category is modeling.
I understand that. I'm not sure you do.
I am interested in understanding the category Hask that represents what Haskell really is like as a programming language.
Good luck.
Not under some semantic mapping that includes non-computable functions, and that forgets some of the interesting structure that comes from laziness (though that is undoubtably also very interesting).
Bear in mind in your quest, that at the end of it you'll most likely conclude, like everyone else, that good old equational reasoning is sound for the programs you actually write at least 90% of the time (with appropriate induction principles), and complete for at least 90% of what you want to write, and go back to using it exclusively for real programming. jcc

Jonathan Cast
Extensionality is a key part of the definition of all of these constructions. The categorical rules are designed to require, in concrete categories, that the ranges of the two injections into a coproduct form a partition of the coproduct, the surjective pairing law (fst x, snd x) = x holds, and the eta reduction law (\ x -> f x) = f holds. Haskell flouts all three; while some categories have few enough morphisms to get away with this (at least some of the time), Hask is not one of them.
That interpretation is not something that is essential in the notion of category, only in certain specific examples of categories that you know.
I understand category theory. I also know that the definitions used are chosen to get Set `right', which means extensionality in that case, and are then extended uniformly across all categories. This has the effect of requiring similar constructions to those in Set in other concrete categories.
Referring to my copy of "Sheaves in Geometry and Logic", Moerdijk and Mac Lane state that "in 1963 Lawvere embarked on the daring project of a purely categorical foundation of all mathematics". Did he fail? I'm probably misunderstanding what you are saying here, but I didn't think you needed sets to define categories; in fact Set is a topos, which has far more structure than a category. Can you be clearer what you mean by extensionality in this context? And how it relates to Set?

On a broader note, I'm pleased that this discussion is taking place, and I wish someone would actually write a wiki page on why Haskell isn't a nicely behaved category and what problems this causes / doesn't cause. I wish I had time. Dominic.

On 6 Jan 2008, at 12:27 PM, Dominic Steinitz wrote:
Jonathan Cast
writes: Extensionality is a key part of the definition of all of these constructions. The categorical rules are designed to require, in concrete categories, that the ranges of the two injections into a coproduct form a partition of the coproduct, the surjective pairing law (fst x, snd x) = x holds, and the eta reduction law (\ x -> f x) = f holds. Haskell flouts all three; while some categories have few enough morphisms to get away with this (at least some of the time), Hask is not one of them.
That interpretation is not something that is essential in the notion of category, only in certain specific examples of categories that you know.
I understand category theory. I also know that the definitions used are chosen to get Set `right', which means extensionality in that case, and are then extended uniformly across all categories. This has the effect of requiring similar constructions to those in Set in other concrete categories.
Referring to my copy of "Sheaves in Geometry and Logic", Moerdijk and Mac Lane state that "in 1963 Lawvere embarked on the daring project of a purely categorical foundation of all mathematics". Did he fail? I'm probably misunderstanding what you are saying here but I didn't think you needed sets to define categories;
Right. But category theory is nevertheless `backward compatible' with set theory, in the sense that the category-theoretic constructions in a category satisfying ZFC will be the same constructions we are familiar with already. The category-theoretic definitions, when specialized to Set, are precise (up to natural isomorphism) definitions of the pre-existing concepts of cartesian products, functions, etc. in Set. Or, to put it another way, the category-theoretic definitions are generalizations of those pre-existing concepts to other categories. Hask has a structure that is Set-like enough that these concepts generalize very little when moving to Hask.
in fact Set is a topos which has far more structure than a category. Can you be clearer what you mean by extensionality in this context?
By `extensionality' I mean the equalities which follow from using standard set-theoretic definitions for functions, products, coproducts, etc. --- surjective pairing, eta-contraction, etc. My understanding is that, in fact, the category-theoretic definitions are designed to capture those equations in diagrams that can be used as definitions in arbitrary categories. It's possible to view those definitions, then, as more fundamental descriptions of the concepts than what they generalize, but the fact that they are generalizations of the ideas from Set shows up in categories similar to Set (and Hask is certainly more similar to Set than, say, Vec). jcc

On 6 Jan 2008, at 3:55 AM, Yitzchak Gale wrote:
I wrote:
What goes wrong with finite coproducts? The obvious thing to do would be to take the disjoint union of the sets representing the types, identifying the copies of _|_.
Jonathan Cast wrote:
This isn't a coproduct. If we have f x = 1 and g y = 2, then there should exist a function h such that h . Left = f and h . Right = g... But by your rule, Left undefined = Right undefined... Which is a contradiction...
OK, thanks.
Unless, of course, you require your functions to be strict --- then both f and g above become illegal, and repairing them removes the problem.
You don't have to make them illegal - just not part of your notion of "coproduct".
We're talking past each other --- what is the distinction you are making? jcc
participants (5)
-
Dominic Steinitz
-
Jens Blanck
-
Jonathan Cast
-
Miguel Mitrofanov
-
Yitzchak Gale