Before speaking of "APL's mistakes", one should be
clear about what exactly those mistakes *were*.
I should point out that the symbols of APL, as such,
were not a problem. But the *number* of such symbols
was. In order to avoid questions about operator
precedence, APL *hasn't* any. In the same way,
Smalltalk has an extensible set of 'binary selectors'.
If you see an expression like
a ÷> b ~@ c
which operator dominates which? Smalltalk adopted the
same solution as APL: no operator precedence at all.
(APL simply evaluates from right to left; Smalltalk
applies binary selectors from left to right.)
Before Pascal, there was something approaching a
consensus in programming languages that
** tightest
*,/,div,mod
unary and binary +,-
relational operators
not
and
or
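For what it's worth, C still keeps those relative levels for
its binary operators, so a fragment like this (just a sketch)
parses the way the table suggests:

    #include <stdio.h>

    int main(void)
    {
        int a = 2, b = 3, c = 4, d = 10;
        /* * binds tighter than +, and + binds tighter than the
           relational operators, so the condition is read as
           (a + (b * c)) > d, with no parentheses needed.        */
        if (a + b * c > d)
            printf("14 > 10, read with the traditional levels\n");
        return 0;
    }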
In order to make life easier with user-defined
operators, Algol 68 broke this by making unary
operators (including not and others you haven't
heard of like 'down' and 'upb') bind tightest.
As it turned out, this may have made life
easier for the compiler, but not for people.
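C's unary operators bind tightest too, and the cost to readers
is easy to demonstrate with a classic slip (a made-up snippet,
not anyone's real code):

    #include <stdio.h>

    int main(void)
    {
        int x = 2, y = 1;
        /* Unary ! binds tighter than ==, so the test below means
           (!x) == y, i.e. 0 == 1, which is false, even though the
           intended !(x == y) would have been true here.            */
        if (!x == y)
            printf("never printed for these values\n");
        else
            printf("oops: read as (!x) == y\n");
        return 0;
    }

Modern C compilers even have a warning (gcc and clang's
-Wlogical-not-parentheses) aimed at exactly this pattern.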
In order, allegedly, to make life easier for
students, Pascal broke this by putting 'or' at the
same level as '+' and 'and' at the same level as '*'.
To this day, many years after Pascal vanished
(Think Pascal is dead, MrP is dead, MPW Pascal
is dead, IBM mainframe Pascal died so long ago
it doesn't smell any more, Sun Pascal is dead, ...)
a couple of generations of programmers believe
that you have to write
(x > 0) && (x < n)
in C, because of what their Pascal-trained predecessors
taught them.
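In C the parentheses are in fact redundant: the relational
operators bind tighter than &&, so the two functions below
(names invented for the example) are parsed identically:

    #include <stdio.h>

    /* Relational operators bind tighter than &&, so these two
       bodies mean exactly the same thing.                       */
    static int in_range_paren(int x, int n) { return (x > 0) && (x < n); }
    static int in_range_bare(int x, int n)  { return x > 0 && x < n; }

    int main(void)
    {
        printf("%d %d\n", in_range_paren(3, 10), in_range_bare(3, 10));   /* 1 1 */
        printf("%d %d\n", in_range_paren(-1, 10), in_range_bare(-1, 10)); /* 0 0 */
        return 0;
    }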
If we turn to Unicode, how should we read
a ⊞ b ⟐ c
Maybe someone has a principled way to tell. I don't.
Someone once argued that, given the duality between (∨, True)
and (∧, False), ∧ and ∨ should have the same precedence.
Evidently not too many people agree with him!