
Over the past years I have become more and more aware that common mathematical notation is full of inaccuracies, abuses, and stupidity. I wonder whether mathematical notation is the subject of a branch of mathematics in its own right, and whether there are papers about this topic, e.g. about how common mathematical notation could be improved using ideas from functional languages. Things I'm unhappy about include, for instance:

- $f(x) \in L(\mathbb{R})$, where $f \in L(\mathbb{R})$ is meant
- $F(x) = \int f(x)\,\mathrm{d}x$, where $x$ shouldn't be visible outside the integral
- $O(n)$, which should be O(\n -> n), i.e. $O$ applied to the function $\lambda n.\, n$ rather than to a number (a remark by Simon Thompson in The Craft of Functional Programming)
- $a < b < c$, which is a shortcut for $a < b \land b < c$
- $f(\cdot)$, which means \x -> f x, or just $f$

All of these examples expose a common misunderstanding of functions (the sketch below spells out the distinction), so I assume that the pioneers of functional programming must also have worried about common mathematical notation.
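
To make the function-versus-value confusion in these examples concrete, here is a minimal Haskell sketch. All names in it (f, BigO, between, integral, fDot) are invented for illustration, and the integral is a crude numerical approximation, not a proposal for formalizing analysis:

```haskell
-- A concrete function on the reals (approximated by Double).
f :: Double -> Double
f x = x * x

-- 'f' is the function; 'f 2' is a number.  Writing "f(x) ∈ L(ℝ)" conflates
-- the two: membership in a function space is a statement about 'f' itself.
valueAtTwo :: Double
valueAtTwo = f 2

-- Landau notation takes a *function* as its argument, so the honest spelling
-- is O(\n -> n) rather than O applied to the number n.
newtype BigO = BigO (Integer -> Integer)

linearGrowth :: BigO
linearGrowth = BigO (\n -> n)

-- a < b < c has to be spelled out as the conjunction it abbreviates.
between :: Ord a => a -> a -> a -> Bool
between a b c = a < b && b < c

-- A definite integral takes the function f, not "f(x)"; the integration
-- variable is bound inside and never visible outside.  Crude rectangle rule,
-- only to make the types explicit.
integral :: (Double -> Double) -> Double -> Double -> Double
integral g a b = sum [g x * dx | x <- [a, a + dx .. b - dx]]
  where
    dx = (b - a) / 1000

-- f(.) in the sense of \x -> f x is, by eta-reduction, just f.
fDot :: Double -> Double
fDot = \x -> f x

main :: IO ()
main = print (valueAtTwo, between 1 2 3, integral f 0 1, fDot 3)
```

The common thread is that in each case the thing being talked about is a function, and a notation that forces you to write the lambda (or the bare name $f$) makes the abuse visible.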