
On Sat, Jul 04, 2009 at 01:08:41AM +0200, Henning Thielemann wrote:
> Ross Paterson wrote:
>> On Tue, Jun 30, 2009 at 01:37:05PM +0200, Henning Thielemann wrote:
>>> This sounds like a rather ad-hoc extension of the already
>>> complicated hierarchy of those category classes. Are there
>>> particular examples where the specialisation is needed and where
>>> it cannot be done by optimizer rules?
>> Here's one for (<$). In Data.Sequence, I could define
>>
>>     x <$ s = replicate (size s) x
>>
>> (using Louis Wasserman's replicate), which would take O(log n) time
>> and space, a big improvement over the O(n) version using const and
>> fmap.
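
For concreteness: assuming (<$) becomes a Functor class method with
default definition fmap . const (which is what the proposal amounts
to, and what today's base provides), such an override could look
roughly like the following sketch. The newtype S is purely
illustrative, standing in for the real Seq instance inside
Data.Sequence:

    import qualified Data.Sequence as Seq
    import Data.Sequence (Seq)

    -- Illustrative wrapper, so the snippet compiles on its own
    -- without clashing with the real Functor instance for Seq.
    newtype S a = S (Seq a)

    instance Functor S where
        fmap f (S s) = S (fmap f s)
        -- Override the default (<$) = fmap . const: build the
        -- result in O(log n) time and space, sharing a single
        -- copy of x, instead of mapping over every element.
        x <$ S s = S (Seq.replicate (Seq.length s) x)

The O(log n) bound comes from replicate assembling the result out of
shared copies of the one x, rather than visiting all n positions.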
> Would it be reasonable to let the optimizer replace (x <$ s) by
> (replicate (size s) x) via RULES?
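
Spelled out, such a rule, together with the kind of polymorphic
wrapper that shows its limits, might look roughly like this sketch
(the rule name and the function fill are made up for illustration):

    import qualified Data.Sequence as Seq
    import Data.Sequence (Seq)
    import Data.Functor ((<$))

    -- The rewrite being suggested.  Note that GHC typically warns
    -- that a rule on a class method may never fire, because the
    -- method can be inlined to its instance definition before the
    -- rule gets a chance to match.
    {-# RULES
    "<$/Seq" forall x s.
        x <$ s = Seq.replicate (Seq.length s) x
      #-}

    -- A polymorphic function built on top of (<$).  Inside fill
    -- the functor is unknown, so the Seq-specific rule above can
    -- never match here, even when fill is later used at type Seq.
    fill :: Functor f => a -> f b -> f a
    fill x s = x <$ s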
I don't like using RULES for optimizations that actually change the
computational or space complexity of code. The reason is that the
complexity often determines whether an algorithm works at all (like
whether frisby requires O(n) space or O(1) space, a fundamental
difference in functionality). We need control over which algorithms
are used, not reliance on a compiler-specific extension that may or
may not get applied properly.

In addition, RULES won't apply to polymorphic functions written on
top of the existing ones that call class methods. Even when RULES do
apply, they only apply at the top-level interface. Instance
definitions will always be used.

        John

--
John Meacham - ⑆repetae.net⑆john⑈ - http://notanumber.net/