deepseq: instance NFData (a -> b)

According to the Haddock comments, between deepseq-1.2 and deepseq-1.3 an instance of NFData for functions was introduced without prior discussion. I find this instance pretty problematic since it has no superclasses. The correct instance would certainly be something like

instance (Enumerate a, NFData b) => NFData (a -> b)

where Enumerate would be a new class that allows enumerating all values of a type. This would hardly be useful because it is pretty inefficient. I'd prefer that the instance be removed again, or even better, replaced by a non-implementable instance. Alternatively, we could replace it with a correct implementation with the corresponding superclasses. If we do the latter, we could still omit the Enumerate instance for types where enumerating all values of the type is too expensive. I assume the instance was added to simplify automatic derivation of NFData instances. However, I think it would be better if people inserted custom deepseq implementations for the affected functions instead.
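To make the proposal concrete, here is a self-contained sketch. NFData' and Enumerate are illustrative stand-ins invented for this example, not the real deepseq API (a local class avoids clashing with deepseq's actual instance):

```haskell
-- Illustrative stand-ins: NFData' mirrors deepseq's NFData, and
-- Enumerate is the hypothetical class from the proposal above.
class NFData' a where
  rnf' :: a -> ()

class Enumerate a where
  enumerate :: [a]

instance NFData' Bool where
  rnf' b = b `seq` ()

instance Enumerate Bool where
  enumerate = [False, True]

-- The proposed instance: a function is in normal form when its result
-- at every possible argument is in normal form.
instance (Enumerate a, NFData' b) => NFData' (a -> b) where
  rnf' f = foldr (\x u -> rnf' (f x) `seq` u) () enumerate
```

Even for a two-value domain like Bool this forces the function at every possible argument, which illustrates why such an instance would be prohibitively expensive for most types.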

+1
This instance doesn't make much sense (to me at least) and is pretty
problematic for apps that use NFData constraints as evidence that
values are ground and fully evaluated (hence don't capture arbitrary
resources). In HaskellR we use NFData constraints to make sure only
fully evaluated ground data escapes the scope of a region. But
functions are not first-order values. They can leak arbitrary
resources out of the scope of a region.
--
Mathieu Boespflug
Founder at http://tweag.io.
On 1 May 2016 at 16:38, Henning Thielemann
[...]
_______________________________________________
Libraries mailing list
Libraries@haskell.org
http://mail.haskell.org/cgi-bin/mailman/listinfo/libraries

On 2016-05-01 at 16:51:45 +0200, Boespflug, Mathieu wrote: [...]
This instance doesn't make much sense (to me at least) and is pretty problematic for apps that use NFData constraints as evidence that values are ground and fully evaluated (hence don't capture arbitrary resources). In HaskellR we use NFData constraints to make sure only fully evaluated ground data escapes the scope of a region. But functions are not first-order values. They can leak arbitrary resources out of the scope of a region.
Are the recently added NFData instances for IORef/MVar problematic as well?
-- hvr

On 1 May 2016 at 17:00, Herbert Valerio Riedel
On 2016-05-01 at 16:51:45 +0200, Boespflug, Mathieu wrote:
[...]
Are the recently added NFData instances for IORef/MVar problematic as well?
I guess so, yes! These three instances, along with the STRef one, strike me as antithetical to the purpose stated at the top of the DeepSeq module: "A typical use is to prevent resource leaks in lazy IO programs, [...]. Another common use is to ensure any exceptions hidden within lazy fields of a data structure do not leak outside the scope of the exception handler, or to force evaluation of a data structure in one thread, before passing to another thread (preventing work moving to the wrong threads)." Or perhaps we should at the very least clarify and characterize precisely what invariants NFData instances are or are not expected to enforce?

I say no. An `IORef` is just a pointer, and forcing one to NF means
quite simply that you determine exactly where it points, which is
completely different from forcing whatever happens to be on the other
end. A function seems much more problematic.
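This point can be demonstrated with a small sketch. The `rnfRef` below is a local stand-in mirroring what (as I understand it) deepseq's IORef instance does: it forces the reference cell itself, never its contents.

```haskell
import Control.Exception (evaluate)
import Data.IORef (IORef, newIORef)

-- A "pointer-only" rnf for IORef, as a local stand-in for the deepseq
-- instance being discussed: forcing the reference determines where it
-- points, but never touches what it points at.
rnfRef :: IORef a -> ()
rnfRef r = r `seq` ()
```

Applying `rnfRef` to a reference whose contents are `undefined` succeeds, because only the reference itself is forced, not whatever happens to be on the other end.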
On Sun, May 1, 2016 at 11:00 AM, Herbert Valerio Riedel
On 2016-05-01 at 16:51:45 +0200, Boespflug, Mathieu wrote:
[...]
Are the recently added NFData instances for IORef/MVar problematic as well?
-- hvr

It depends on what you mean when you say a function is in normal form.
Unfortunately the instance doesn't capture either of the definitions
that immediately come to my mind:
- All values it returns are in normal form? This requires an NFData b
constraint.
- The body of the function is in normal form? This requires HNF, which
we don't have in Haskell. I guess the current instance is going for
this, but approximates HNF with WHNF.
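The two readings could be sketched like this, using a local class (NFDataLocal is a stand-in invented here, so it doesn't clash with deepseq's real instance):

```haskell
-- Local stand-in for NFData, to keep the sketch independent of deepseq.
class NFDataLocal a where
  rnfLocal :: a -> ()

-- Reading 2, approximated: what the current instance appears to do,
-- namely force the function value itself to WHNF only.
instance NFDataLocal (a -> b) where
  rnfLocal f = f `seq` ()

-- Reading 1 would instead need to inspect every result, e.g.
--   instance (Enumerate a, NFDataLocal b) => NFDataLocal (a -> b) where
--     rnfLocal f = foldr (seq . rnfLocal . f) () enumerate
-- where Enumerate is the hypothetical enumeration class proposed earlier.
```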
I'm +1 to removing this, simply because there are at least three
(including yours) reasonable definitions. If someone wants an NFData
instance for a value which includes functions, they should know exactly
how they want to handle that.
On Sun, 1 May 2016 16:38:07 +0200 (CEST)
Henning Thielemann
[...]
-- Michael Walker (http://www.barrucadu.co.uk)

On Sun, May 1, 2016 at 10:58 AM, Michael Walker
It depends on what you mean when you say a function is in normal form. Unfortunately the instance doesn't capture either of the definitions that immediately come to my mind:
- All values it returns are in normal form? This requires an NFData b constraint.
Even when talking about term rewriting, reducing an application of two normal forms does not necessarily yield a normal form directly.
- The body of the function is in normal form? This requires HNF, which we don't have in Haskell. I guess the current instance is going for this, but approximates HNF with WHNF.
Requiring the body to be in normal form would actually be normal form. Head normal form is actually a weaker condition, where the body must not be a redex. But the distinction is rather academic, because neither of these can easily be achieved in GHC.

Anyhow, I would suggest not getting hung up on what 'normal form' means, because it is actually just a bad name for what is going on once functions are involved. Really, that's why the class is named `NFData` in my mind: talking about what it does as being the 'normal form' only really makes sense when you're talking about pure, sum-of-products algebraic data, and functions are not that.

The more important question is: what is desirable behavior, and why? Why would enumerating all possible results of a function and deep seqing them be the desired behavior of deep seqing a function? It doesn't necessarily have, for instance, the sort of 'pull everything into memory to free another scarce resource' effect mentioned, because functions don't work that way. I would guess that the main argument for this behavior is to say that it is the only 'allowed' behavior, but, being useless, it should just be removed. But it would be better to argue that the instance should be removed directly.

-- Dan

I would find it interesting to learn what exactly will break when this instance is removed / what feature motivated the addition of the instance. I'm taking a guess here, maybe it's Generics-based deriving?

data D = D { something :: Int, somethingElse :: Int -> String }
  deriving Generic

deriving instance NFData D

Is it this?
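The guessed scenario can be spelled out as a compilable sketch, assuming deepseq >= 1.4's Generic-based default method (the DeriveAnyClass form below is equivalent to the standalone deriving above):

```haskell
{-# LANGUAGE DeriveGeneric, DeriveAnyClass #-}
-- Sketch of the guessed breakage: the Generic-derived NFData instance
-- needs an NFData instance for every field, so the function-typed field
-- pulls in NFData (Int -> String), i.e. the instance under discussion.
import GHC.Generics (Generic)
import Control.DeepSeq (NFData, rnf)

data D = D { something :: Int, somethingElse :: Int -> String }
  deriving (Generic, NFData)
```

If the function instance were removed, this deriving would stop compiling unless D were given a hand-written instance that decides what to do with the function field.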
participants (7)
- Boespflug, Mathieu
- Dan Doel
- David Feuer
- Henning Thielemann
- Herbert Valerio Riedel
- Michael Walker
- Niklas Hambüchen