
> Ross Paterson cryptically writes:
>
> > Shouldn't IOError be identified with IOException rather than
> > Exception?
>
> I had to grovel through the code to understand what this question
> means.

Well, you could have grovelled through the documentation instead :-)
http://www.haskell.org/ghc/docs/latest/html/base/Control.Exception.html
It seems that GHC.IOBase contains these definitions:

    type IOError = Exception

    data Exception = ArithException ArithException
                   | ...
                   | IOException IOException
                   | ...

    data IOException = IOError
      { ioe_handle   :: Maybe Handle,   -- the handle used by the action
                                        -- flagging the error
        ioe_type     :: IOErrorType,    -- what it was
        ioe_location :: String,         -- location
        ioe_descr    :: String,         -- error type specific information
        ioe_filename :: Maybe FilePath  -- filename the error is related to
      }
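As a quick illustration of the IO side of this interface (a sketch using today's module layout - catch now lives in Control.Exception and the IOError inspection functions in System.IO.Error, rather than GHC.IOBase - and a made-up error message), throwing and catching an IOError looks something like this:

```haskell
import Control.Exception (catch)
import System.IO.Error (ioeGetErrorString, isUserError)

-- Throw an IOError with ioError/userError, then catch it and inspect it.
-- isUserError fixes the handler's argument type to IOError.
main :: IO ()
main = do
  msg <- ioError (userError "no such frobnicator")
           `catch` \e ->
             if isUserError e
               then return ("caught user error: " ++ ioeGetErrorString e)
               else ioError e  -- rethrow anything we don't recognise
  putStrLn msg
```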
I think the idea is that we want code protected using catch clauses to be safe against not just IOErrors but also all the exceptions (division by zero, etc) that can happen. Since the type of 'catch' is fixed by the report, we have to make IOError include all those types.
It wasn't always this way: before GHC 5.00, IOError was what is now called IOException. We changed it so that IOError == Exception because it seemed simpler: IO.ioError can be used to throw any exception, and Exception.catch and IO.catch have the same type. I think there were more good reasons, but I can't remember them now (the change came about when Simon P.J. was trying to describe this stuff for his "awkward squad" paper).

Personally I'm not completely happy with the design; the IOError == Exception thing is a bit strange. But most of the complication arises if you try to mix the two interfaces to exceptions (IO and Exception) - if you stick to the Exception interface, the design is quite consistent.
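To illustrate what sticking to the Exception interface buys you, here is a sketch in terms of today's Control.Exception, where SomeException plays the role of the all-encompassing exception type (the run helper and the messages are made up for the example):

```haskell
import Control.Exception (SomeException, catch, evaluate, throwIO)

-- One catch, one handler type, regardless of where the exception
-- came from: an IO action or an error raised from pure code.
run :: IO a -> IO ()
run act = (act >> putStrLn "ok")
            `catch` \e -> putStrLn ("caught: " ++ show (e :: SomeException))

main :: IO ()
main = do
  run (throwIO (userError "disk on fire"))  -- an IOException
  run (evaluate (error "boom"))             -- an error from pure code
```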
The report deliberately specifies IOError in such a way that additional kinds of IOError can be easily added.
This only leaves the question of whether division by 0 is an IO error or is a pure error. The argument is that the only thing which causes evaluation to happen is IO - if your program doesn't interact with the outside world, it might as well not do anything. So, in that sense, all errors are errors triggered by performing IO. [You can disagree with this argument if you like - the fact will remain that we'd like Prelude.catch to catch as many exceptions as possible even if the names of the types seem a little screwy when we do that.]
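Concretely, using today's names (catch and evaluate from Control.Exception, which correspond to the old Exception interface), division by zero turns up as an ArithException that an Exception-level catch can handle:

```haskell
import Control.Exception (ArithException, catch, evaluate)

-- evaluate forces the division inside IO, so the exception is raised
-- where the surrounding catch can see it; a pure 1 `div` 0 only blows
-- up when something demands its value.
main :: IO ()
main = do
  r <- fmap show (evaluate (1 `div` 0 :: Integer))
         `catch` \e -> return ("caught: " ++ show (e :: ArithException))
  putStrLn r
```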
You can't have Prelude.catch catch division by zero and pattern match failures (for example), because that wouldn't be Haskell 98. That's why we have Exception.catch, which does catch these errors.

Cheers,
Simon