The Proper Definition of (evaluate :: a -> IO a)

The obvious "evaluate x = x `seq` return x" fails one of the following laws for evaluate:

evaluate x `seq` y ==> y
evaluate x `catch` f ==> (return $! x) `catch` f
evaluate x >>= f ==> (return $! x) >>= f

Note: the first equation implies that (evaluate x) is not the same as (return $! x). However, strictness does not obey the monad laws. A correct definition for IO would be "evaluate x = (x `seq` return x) >>= return", or equivalently "(return $! x) >>= return", or even more horrible-looking, "fmap id (return $! x)"! This works because (at least in present Haskell implementations) IO's (>>=) is not strict in its first argument. I noticed that this function has been implemented (slightly incorrectly) for NHC in base, so I mention this odd suggestion.

Isaac
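The distinction Isaac describes can be checked directly. A minimal sketch, assuming current GHC behaviour (unoptimised); the names evaluateA and evaluateB are illustrative, not from the original message:

```haskell
import Control.Exception (SomeException, try)

-- The rejected definition: seq'ing the resulting action forces x,
-- so the law  evaluate x `seq` y ==> y  fails when x is undefined.
evaluateA :: a -> IO a
evaluateA x = x `seq` return x

-- Isaac's suggested definition: because (>>=) is not strict in its
-- first argument in present IO implementations, the outer action is
-- already in WHNF, and seq'ing it does not touch x.
evaluateB :: a -> IO a
evaluateB x = (x `seq` return x) >>= return

main :: IO ()
main = do
  -- seq'ing evaluateB's action does not force the undefined argument:
  evaluateB (undefined :: Int) `seq` putStrLn "first law holds for evaluateB"
  -- executing it still raises the (catchable) exception:
  r <- try (evaluateB (error "boom" :: Int)) :: IO (Either SomeException Int)
  putStrLn (either (const "caught") (const "no exception") r)
```

Swapping evaluateB for evaluateA in the first line of main would diverge instead of printing, which is exactly the failure of the first law.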

Isaac Dupree
The obvious "evaluate x = x `seq` return x" fails one of the following laws for evaluate:
evaluate x `seq` y ==> y
I'm not sure why anyone thinks this "law" should hold, since it completely changes the known semantics of `seq`. A more accurate law would be:

evaluate x `seq` y ==> if x == _|_ then _|_ else y

Where did you find the erroneous version?

Regards, Malcolm

On Mon, 07 May 2007, Malcolm Wallace wrote:
Isaac Dupree wrote:
The obvious "evaluate x = x `seq` return x" fails one of the following laws for evaluate:
evaluate x `seq` y ==> y
I'm not sure why anyone thinks this "law" should hold, since it completely changes the known semantics of `seq`. A more accurate law would be:
evaluate x `seq` y ==> if x==_|_ then _|_ else y
You seem to be assuming that evaluate ⊥ = ⊥, which is not necessarily the case.
Where did you find the erroneous version?
Presumably he found it by reading the documentation for evaluate: http://haskell.org/ghc/docs/latest/html/libraries/base/Control-Exception.htm... -- /NAD

Nils Anders Danielsson wrote:
Where did you find the erroneous version?
Presumably he found it by reading the documentation for evaluate: http://haskell.org/ghc/docs/latest/html/libraries/base/Control-Exception.htm...
Actually, I checked the source code for Control.Exception before asking the question, and those laws definitely do not appear there. So now I'm kind of puzzled as to how Haddock managed to generate documentation for them!
evaluate x `seq` y ==> y
I'm not sure why anyone thinks this "law" should hold,
You seem to be assuming that evaluate ⊥ = ⊥, which is not necessarily the case.
Yes, I did make a mistake when I asked the question, in assuming some intuitive semantics rather than the actually written ones. So taking the law at face value, it seems to say that 'evaluate' is supposed to force its argument, yet the result should be guaranteed to be in WHNF, even if the argument was in fact undefined. Given that the result is in the IO monad, it can only mean one of two things:

(1) That 'evaluate' should _not_ force its argument when it is called, but rather that the argument should be forced only when the resultant IO action is executed. This conflicts with the documentation, which implies that the argument is forced _before_ the IO action is created or executed. But I guess this semantics accords with Isaac's suggested definition:

evaluate x = (x `seq` return x) >>= return

(2) A constraint on the internal representation of IO computations, such that it must implement imprecise exceptions. Thus, 'evaluate' could catch any exception caused by undefinedness in its argument, and wrap it up as a defined value again, e.g.

evaluate x = (x `seq` return x) `catch` (\e -> IO (\world -> (x, world)))

Hmm, actually this _also_ has the property of delaying when x is forced to the moment of execution of the result. Maybe

evaluate x = (x `seq` return x) `FATBAR` return x

or something otherwise not expressible in Haskell without extra primitives? Of course, perhaps 'evaluate' is itself supposed to be primitive.

I'm not sure I have expressed my reasoning very clearly, but either way, some clarification would be welcome.

Regards, Malcolm
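Interpretation (2) can in fact be approximated with the public Control.Exception API rather than IO internals. A sketch under that assumption; evaluate2 is a made-up name, and this differs from the real evaluate, which re-raises rather than re-wrapping:

```haskell
{-# LANGUAGE ScopedTypeVariables #-}
import Control.Exception (SomeException, catch, evaluate, try)

-- Malcolm's interpretation (2), written with the public `catch`:
-- undefinedness in x is caught and x is re-wrapped as a (still lazy)
-- successful result, so the action itself never fails.
evaluate2 :: a -> IO a
evaluate2 x = (x `seq` return x) `catch` \(_ :: SomeException) -> return x

main :: IO ()
main = do
  r <- evaluate2 (error "deferred" :: Int)
  putStrLn "evaluate2 itself succeeded"
  -- the undefinedness resurfaces only when the result is forced:
  e <- try (evaluate r) :: IO (Either SomeException Int)
  putStrLn (either (const "forcing the result raised it again") (const "??") e)
```

As Malcolm notes, this variant also delays forcing x until the action is executed, so observationally it differs from interpretation (1) only in whether the exception escapes the action.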

Malcolm Wallace wrote:
Nils Anders Danielsson wrote:
Where did you find the erroneous version?
Presumably he found it by reading the documentation for evaluate: http://haskell.org/ghc/docs/latest/html/libraries/base/Control-Exception.htm...
Actually, I checked the source code for Control.Exception before asking the question, and those laws definitely do not appear there. So now I'm kind of puzzled as to how Haddock managed to generate documentation for them!
In the source code, grep found it at the end of libraries/base/GHC/Exception.lhs
(1) That 'evaluate' should _not_ force its argument when it is called, but rather that the argument should be forced only when the resultant IO action is executed. This conflicts with the documentation, which implies that the argument is forced _before_ the IO action is created or executed.
I didn't think so... though in most cases the IO action is only scrutinized/created in order to immediately execute it.
But I guess this semantics accords with Isaac's suggested definition: evaluate x = (x `seq` return x) >>= return
I think evaluate's non-strictness until the IO is executed is similar to IO's (>>=)'s non-strictness in its first argument (would there be any performance consequences to this artificially being different, in GHC, I wonder?), except that evaluate's behaviour is documented, and that behaviour has a purpose with respect to exceptions, considering that evaluate is found in Control.Exception. I haven't found anything indicating that IO's (>>=) must not be strict in its first argument; it's just that all implementations seem to do it that way. So, is my suggested implementation really a good idea?

Isaac
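The non-strictness Isaac is relying on can be observed directly. A quick check, assuming current GHC's newtype representation of IO (this is observed behaviour, not something the language report guarantees):

```haskell
main :: IO ()
main = do
  -- IO's (>>=) is not strict in its first argument: this composite
  -- action can be built and forced to WHNF even though its first
  -- component is undefined.
  let act = (undefined :: IO ()) >>= \_ -> return ()
  act `seq` putStrLn "(>>=) did not force its first argument"
  -- Only *executing* act would hit the undefined first action.
```

An implementation whose (>>=) pattern-matched strictly on its first argument would diverge at the `seq` instead, which is why Isaac asks whether the non-strictness can be assumed.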

On Mon, May 07, 2007 at 06:57:51PM -0400, Isaac Dupree wrote:
Malcolm Wallace wrote:
But I guess this semantics accords with Isaac's suggested definition: evaluate x = (x `seq` return x) >>= return
Let's try a little Equational Reasoning:

evaluate x = (x `seq` return x) >>= return
evaluate x = (>>=) (seq x (return x)) return
evaluate x = bindIO (seq x (returnIO x)) returnIO
evaluate x = IO (\st1 -> case unIO (seq x (returnIO x)) st1 of (# st2, a #) -> unIO (returnIO a) st2)
evaluate x = IO (\st1 -> case unIO (seq x (returnIO x)) st1 of (# st2, a #) -> (# st2, a #))
evaluate x = IO (\st1 -> case seq x (\st3 -> (# st3, x #)) st1 of (# st2, a #) -> (# st2, a #))
evaluate x = IO (\st1 -> case (case x of __DEFAULT -> (\st3 -> (# st3, x #))) st1 of (# st2, a #) -> (# st2, a #))
evaluate x = IO (\st1 -> case x of __DEFAULT -> case (\st3 -> (# st3, x #)) st1 of (# st2, a #) -> (# st2, a #))
evaluate x = IO (\st1 -> case x of __DEFAULT -> case (# st1, x #) of (# st2, a #) -> (# st2, a #))
evaluate x = IO (\st1 -> case x of __DEFAULT -> (# st1, x #))
evaluate x = IO (\st1 -> case x `seq` () of () -> (# st1, x #))
evaluate x = GHC.Exception.evaluate x

Stefan

Malcolm Wallace wrote:
Nils Anders Danielsson wrote:
Where did you find the erroneous version?
Presumably he found it by reading the documentation for evaluate: http://haskell.org/ghc/docs/latest/html/libraries/base/Control-Exception.htm...
Actually, I checked the source code for Control.Exception before asking the question, and those laws definitely do not appear there. So now I'm kind of puzzled as to how Haddock managed to generate documentation for them!
evaluate x `seq` y ==> y
I'm not sure why anyone thinks this "law" should hold,
You seem to be assuming that evaluate ⊥ = ⊥, which is not necessarily the case.
Yes, I did make a mistake when I asked the question, in assuming some intuitive semantics rather than the actually written ones. So taking the law at face value, it seems to say that 'evaluate' is supposed to force its argument, yet the result should be guaranteed to be in WHNF, even if the argument was in fact undefined. Given that the result is in the IO monad, it can only mean one of two things:
(1) That 'evaluate' should _not_ force its argument when it is called, but rather that the argument should be forced only when the resultant IO action is executed. This conflicts with the documentation, which implies that the argument is forced _before_ the IO action is created or executed. But I guess this semantics accords with Isaac's suggested definition:

evaluate x = (x `seq` return x) >>= return
That's exactly right. Evaluate was introduced because it does something different from (return $!). We tried in the documentation to be clear about its semantics, but perhaps we weren't clear enough. I'll add Isaac's suggested definition to the docs. Cheers, Simon

Simon Marlow wrote:
That's exactly right. Evaluate was introduced because it does something different from (return $!). We tried in the documentation to be clear about its semantics, but perhaps we weren't clear enough. I'll add Isaac's suggested definition to the docs.
If Isaac's definition is accurate, we might as well generalise it to any Monad. -- Ashley Yakeley
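Ashley's generalisation is a one-liner; a sketch (evaluateM is a hypothetical name, and how much it differs from (return $!) depends entirely on the strictness of each monad's (>>=)):

```haskell
import Control.Exception (SomeException, evaluate, try)

-- Isaac's definition, generalised to an arbitrary Monad.
evaluateM :: Monad m => a -> m a
evaluateM x = (x `seq` return x) >>= return

main :: IO ()
main = do
  -- In Maybe, (>>=) pattern-matches its first argument, so evaluateM
  -- collapses to (return $!): an undefined argument makes the whole
  -- Maybe value undefined (we force it to WHNF via IO to observe this).
  r <- try (evaluate (evaluateM (error "strict" :: Int) :: Maybe Int))
         :: IO (Either SomeException (Maybe Int))
  putStrLn (either (const "Maybe: strict, like (return $!)") (const "unexpected") r)
  -- In IO, the non-strict (>>=) keeps the argument unforced until execution:
  (evaluateM (undefined :: Int) :: IO Int) `seq` putStrLn "IO: argument not yet forced"
```

So the generalised definition typechecks everywhere, but it only satisfies all of evaluate's laws in monads whose (>>=) is non-strict in its first argument, which is the caveat raised elsewhere in this thread.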

On Tue, May 08, 2007 at 05:47:27PM -0700, Ashley Yakeley wrote:
Simon Marlow wrote:
That's exactly right. Evaluate was introduced because it does something different from (return $!). We tried in the documentation to be clear about its semantics, but perhaps we weren't clear enough. I'll add Isaac's suggested definition to the docs.
If Isaac's definition is accurate, we might as well generalise it to any Monad.
Isaac's definition is equivalent to the standard evaluate as I proved. However, my proof could be using misfeatures of IO. Stefan

Stefan O'Rear wrote:
On Tue, May 08, 2007 at 05:47:27PM -0700, Ashley Yakeley wrote:
If Isaac's definition is accurate, we might as well generalise it to any Monad.
Isaac's definition is equivalent to the standard evaluate as I proved. However, my proof could be using misfeatures of IO.
Monads like Maybe and [] have no definition of evaluate consistent with all the laws; they only have (return) and (return $!) as distinct possibilities. In that case my definition makes evaluate in those monads equivalent to (return $!), which might not be that bad. For monads such as lazy ST, it is clear that (>>=) cannot be strict in its first argument in order for the laziness to work, so evaluate can be defined satisfying all the laws. In other function-based monads - including IO - it is not so documented or necessary.

What is evaluate used for? Based on Google Code Search and my own experiences:

(1) In a forkIO, for the sole purpose of parallel evaluation. Could be summarized by:

forkEvaluate :: a -> IO ThreadId
forkEvaluate a = forkIO (evaluate a >> return ())

Aren't there supposed to be more functional ways like `par` to achieve this? Anyway, it looks like that definition only works in the IO monad.

(2) To immediately catch exceptions from it (Control.Exception.catch or try). This also only works in the IO monad (or derived monads that can define catch... there is always (liftIO . evaluate), I guess).

(3) To simply make things be evaluated sooner. This is sometimes necessary, for example to counteract the effects of file-reading using unsafeInterleaveIO. I haven't seen this use in non-IO monads (which might violate separation of concerns), but it could exist. In practice, even in IO code, (return $!) ends up being used; after all, it is not hidden away in a Control.Exception module (that is officially non-portable, perhaps).

No opinion :)

Isaac
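The exception-catching use of evaluate that Isaac mentions is the one that most depends on the real thing. A small illustrative helper (safeHead is a made-up name for this sketch, not a library function):

```haskell
import Control.Exception (SomeException, evaluate, try)

-- Turn an exception hiding in a pure value into an explicit Either.
-- (return $! head xs) would usually work here too, but evaluate is
-- the documented way to force a value inside IO before catching.
safeHead :: [a] -> IO (Either SomeException a)
safeHead xs = try (evaluate (head xs))

main :: IO ()
main = do
  r1 <- safeHead ([] :: [Int])
  putStrLn (either (const "empty list: exception caught") show r1)
  r2 <- safeHead [42 :: Int]
  putStrLn (either (const "unexpected") show r2)
```

Without the evaluate, `try (return (head xs))` would succeed even for an empty list, and the "Prelude.head: empty list" error would escape the try and surface later, uncaught.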
participants (6)
- Ashley Yakeley
- Isaac Dupree
- Malcolm Wallace
- Nils Anders Danielsson
- Simon Marlow
- Stefan O'Rear