
On 2008-04-03, Chris Smith wrote:
Hans Aberg wrote:
This problem is not caused by defining f+g, but by defining numerals as constants.
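For context, the instance under discussion is presumably something like the usual pointwise Num instance for functions, in which fromInteger turns every numeral into a constant function. A minimal sketch of such an instance (it compiles with a modern GHC; under Haskell 98, Eq and Show instances for functions would also have been required, which is point 4 below):

    instance Num b => Num (a -> b) where
      f + g         = \x -> f x + g x          -- the uncontroversial part: pointwise arithmetic
      f * g         = \x -> f x * g x
      negate f      = negate . f
      abs f         = abs . f
      signum f      = signum . f
      fromInteger n = const (fromInteger n)    -- every numeral becomes a constant function

With this instance an expression like 2 (x+y) typechecks: the literal 2 elaborates to const 2, so applying it to (x+y) just returns 2. That is presumably the quirk referred to later in the thread.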
Yup. So the current (Num thing) is basically:
1. The type thing is a ring
2. ... with signs and absolute values
3. ... along with a natural homomorphism from Z into thing
4. ... and with Eq and Show.
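For reference, the Haskell 98 Num class (as it stood at the time of this thread) bundles all four of these, with point 4 appearing as superclass constraints:

    class (Eq a, Show a) => Num a where
      (+), (-), (*) :: a -> a -> a
      negate        :: a -> a
      abs, signum   :: a -> a
      fromInteger   :: Integer -> a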
If one wanted to be perfectly formally correct, then each of 2-4 could be split out of Num. For example, point 2 doesn't make sense for polynomials or n-by-n square matrices, point 4 doesn't make sense for functions, and point 3 doesn't make sense for square matrices of dimension greater than 1. And this quirk about 2(x+y) can be seen as an argument for not wanting it in the case of functions, either. I'm not sure I find the argument terribly compelling, but it is there anyway.
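A minimal sketch of what such a split might look like; the class names here are purely illustrative and do not come from any existing library:

    import Prelude hiding ((+), (*), negate, abs, signum, fromInteger)

    -- Point 1: the ring structure on its own.
    class Ring a where
      (+), (*)  :: a -> a -> a
      negate    :: a -> a
      zero, one :: a

    -- Point 2: signs and absolute values, only where they make sense.
    class Ring a => Signed a where
      abs, signum :: a -> a

    -- Point 3: the natural homomorphism from Z.
    class Ring a => FromInteger a where
      fromInteger :: Integer -> a

    -- Point 4: Eq and Show would simply be requested as ordinary
    -- constraints where needed, rather than baked into the class.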
Just a nit, but 3 seems to make perfect sense for square matrices: n gets mapped onto n * I_d for any dimension d, and the homomorphism laws still hold:

    fromInteger (n*m) == fromInteger n * fromInteger m
    fromInteger (n+m) == fromInteger n + fromInteger m

-- Aaron Denney -><-
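To make the nit concrete, here is a toy 2-by-2 matrix type (not from the thread) whose fromInteger sends n to n * I_2; the two laws above then hold, while abs and signum (point 2) genuinely have no good definition:

    data M2 = M2 Integer Integer Integer Integer   -- row-major: a b / c d
      deriving (Eq, Show)

    instance Num M2 where
      M2 a b c d + M2 a' b' c' d' = M2 (a + a') (b + b') (c + c') (d + d')
      M2 a b c d * M2 a' b' c' d' =
        M2 (a*a' + b*c') (a*b' + b*d') (c*a' + d*c') (c*b' + d*d')
      negate (M2 a b c d) = M2 (negate a) (negate b) (negate c) (negate d)
      abs    _ = error "abs: no sensible definition for matrices"
      signum _ = error "signum: no sensible definition for matrices"
      fromInteger n = M2 n 0 0 n                   -- n is sent to n * I_2

    -- ghci> (fromInteger 6 :: M2) == fromInteger 2 * fromInteger 3
    -- True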