30 Mar 2017, 5:49 p.m.
On Thu, 30 Mar 2017, Bryan Richter wrote:
On Thu, Mar 30, 2017 at 11:18:10PM +0200, Henning Thielemann wrote:
I think the only reason is that some programmers got "laziness" wrong and try to use primitive types for everything instead of using (and importing) dedicated types.
Well, is there anything to be done for it at this point? Is there even any consensus that this was, in retrospect, a poor choice?
The community was and is pretty divided, I think. My suggested compromise is to have the compiler emit a warning when you use certain instances (presumably by accident).
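
For illustration only, here is a minimal sketch of the "dedicated type instead of a primitive type" idea; the module and the names in it are made up for this example, not taken from any library discussed in the thread:

  module Example where

  -- If both identifiers were bare Ints, they could be mixed up silently.
  -- Dedicated newtypes make the mix-up a type error instead.
  newtype UserId  = UserId  Int deriving (Eq, Ord, Show)
  newtype GroupId = GroupId Int deriving (Eq, Ord, Show)

  lookupUser :: UserId -> Maybe String
  lookupUser (UserId 1) = Just "alice"
  lookupUser _          = Nothing

  -- lookupUser (GroupId 1) is rejected by the type checker.

The suggested compromise would then be a compiler warning for the instances on the primitive types themselves, so that accidental uses of them are flagged rather than outlawed.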