Hello, Cafe!

There is an opinion that the Bool type has problems: it's "dangerous", because it's a poor choice for flagging a success/failure result. I read this post: https://existentialtype.wordpress.com/2011/03/15/boolean-blindness/ and was shocked. How popular is this opinion? Is it true that Bool is a "bad" type?

As I understand the arguments of the post, Bool is bad because: 1) you don't know what True/False means; 2) after a comparison you get only a bit (!), and later you may need to "recover" the compared value, which has already been "shrunk" to that bit.

Let's begin with 2. As I understand it, the author talks as if we had a one-register computer ;) if he has to "recover" a value. But the examples shown are in ML, where function arguments are in scope for the whole function body, so you never need to "recover" anything that was bound as a function argument or with "let". That sounds totally weird, and closer to psychology than to CS :)
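
Here is a minimal Haskell sketch of that point (classify is a made-up name): the comparison produces only a Bool, yet the tested value n is still in scope on both branches, so nothing is lost and nothing has to be "recovered":

    classify :: Int -> String
    classify n =
      if n >= 0                           -- the test yields only a Bool ...
        then "non-negative: " ++ show n   -- ... but n itself is still bound
        else "negative: "     ++ show n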

Argument #1 is more interesting. The post says:

A Boolean, b, is either true, or false; that’s it.  There is no information carried by a Boolean beyond its value, and that’s the rub.  As Conor McBride puts it, to make use of a Boolean you have to know its provenance so that you can know what it means.

Really, what does True/False mean? How do we find the semantics of True? It's very simple, because there is A) a contract/interface which interprets True/False, and also B) help from science.

A) There are a lot of languages (Unix shell, ML, Basic, Haskell, C/C++...) with short-circuit expressions, e.g.,

e1 || e2
e1 && e2
e1 orelse e2

where the interface is described by its operations: ||, &&, orelse, etc., and it has an absolutely precise and clear meaning: "||" evaluates e2 iff e1 failed, i.e. was not a success; "&&" evaluates e2 iff e1 succeeded. I avoid the words "True" and "False" here, because the marker of success/failure differs from language to language. For example, in Bash failure is any integer except 0; in Haskell failure is False; in C it is 0... What False (and True) mean is defined by the contract/interface of the short-circuit operations, which comes from Boolean algebra. A rare case where a type is bound to its semantics! We read them literally (plain English): e1 or-else e2!
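
In Haskell this contract is written down directly in the Prelude definitions of the operators (reproduced here as a sketch; the Prelude-hiding import is only so the file compiles on its own):

    import Prelude hiding ((||), (&&))

    (||) :: Bool -> Bool -> Bool
    True  || _ = True    -- e2 is never touched when e1 succeeds
    False || x = x       -- e2 is evaluated only when e1 fails

    (&&) :: Bool -> Bool -> Bool
    True  && x = x       -- e2 is evaluated only when e1 succeeds
    False && _ = False   -- e2 is never touched when e1 fails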

This means that using False to indicate success is an error! And there is no way to lose the provenance, the knowledge of what True or False means.

(the same question: what does Right/Left mean?)
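
Either answers that question the same way, through its contract: the Monad instance short-circuits on Left and continues on Right, so Right is success by construction. A minimal sketch (parsePositive and addTwo are made-up helpers):

    parsePositive :: String -> Either String Int
    parsePositive s = case reads s of
      [(n, "")] | n > 0 -> Right n
      _                 -> Left ("not a positive number: " ++ s)

    -- (>>=) for Either stops at the first Left, just like && stops
    -- at the first failure:
    addTwo :: String -> String -> Either String Int
    addTwo a b = do
      x <- parsePositive a
      y <- parsePositive b
      pure (x + y)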

B) Help from science. Math (and CS) has its own history, and one of its milestones was the birth of formal logic and then of Boolean algebra. CS implemented those in declarative languages (Prolog, for example). If we have some predicate in Prolog, "true" for that predicate means it was achieved, as a goal. If the predicate has side effects, "true" means it was achieved, i.e. all its steps (side effects) were successfully executed. The predicate write_text_to_file/2 is "true" when it actually wrote the text to the file. There is no way to return false on success, or to meditate on some sacral sense of true/false :) And that sense is traditionally the same in every programming language. If you invert it, you deny the contract and the semantics and start using "inverted" logic :)
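
A Haskell analogue of such a predicate (writeTextToFile and its Bool-returning style are just for illustration; idiomatic Haskell would sooner use Either or exceptions):

    import Control.Exception (IOException, try)

    -- True means the goal was achieved: the text really is in the file.
    writeTextToFile :: FilePath -> String -> IO Bool
    writeTextToFile path text = do
      result <- try (writeFile path text) :: IO (Either IOException ())
      pure (either (const False) (const True) result)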

We can repeat the same logic with 3.1415926... What does it mean? Its meaning, its semantics, is described by a contract/interface: this magic irrational was born in the branch of mathematics called trigonometry, and it is that mathematics which defines the semantics of Pi, not the programmer's usage of Pi, the programming context, etc. The semantics of True/False is likewise defined by its algebra: the Boolean one. So a programmer should not change their semantics, am I right?
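
In Haskell terms (a trivial sketch): pi is defined once, in the Floating class, and every use site inherits that meaning rather than redefining it:

    -- pi comes from the Floating class; its meaning is fixed by the
    -- mathematics, not by the call site.
    circleArea :: Double -> Double
    circleArea r = pi * r * r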

So, my question is: was this post April 1st trolling, or was the author serious? :)

---

Best regards, Paul