
If anyone wants to talk about COBOL, it's worth remembering that there have been several major versions of COBOL with major differences. COBOL 68 was, well, meh. COBOL 74 was quite a dramatic improvement, arguably a bigger jump than Fortran 66 -> Fortran 77. COBOL 85 was another dramatic improvement, finally permitting structured programming with user-defined procedures -- roughly comparable to the Fortran 77 -> Fortran 90 jump, where Fortran got free format, a full set of structured programming constructs, modules, nested procedures, and other good stuff.

COBOL development did not stop there. COBOL 2002 added free format, user-defined functions (that is, functions you can use in expressions), recursion (just 42 years after Algol 60...), locales, Unicode, a Boolean data type, pointers, and form-based screen interfaces. Oh, and object orientation! I have to say that OO COBOL is in my view uniquely horrible... but this is not the place to defend that view. COBOL 2014 added native support (seriously weird native support, but native support) for XML, and a set of container classes (these had already been issued by 2009).

One of the merits of COBOL 85 was that the language lacked features that make static analysis difficult, so it was possible to analyse COBOL programs quite thoroughly. The additions in COBOL 2002 have pretty comprehensively destroyed that property, without making COBOL into anything an OO programmer would really _want_ to use. Perhaps it is not surprising that a large amount of COBOL code is still said to be fairly straight COBOL 85, or even COBOL 74.

The non-business world mostly seems to think of COBOL as if it were still 1970, just as the non-heavy-duty-numerics world still seems to think of Fortran as if it were still 1970. Back when it *was* the 1970s, COBOL enjoyed a good reputation for portability, which more academically favoured languages by and large did not. Much of this comes down to arithmetic. The declaration

    77 MY-COUNT PICTURE S999999 USAGE COMPUTATIONAL.

makes it clear that I want something that can hold at least -999,999 .. +999,999 and is good for calculating with, and the rules of COBOL say that I *must* get it, even on a 16-bit machine. In contrast, if I write

    INTEGER MYCNT            -- Fortran
    var mycount: integer;    -- Pascal

then I get what I'm given, and even if I ask for

    var mycount: -999999 .. 999999;

a Pascal compiler is under no obligation to satisfy me.

So COBOL gave accountants portable decimal arithmetic on numbers of up to 18 digits (now 31), *regardless* of the underlying machine, with overflow detection as a *standard* feature (still not available in Java or C#); there is a small sketch of both at the end of this post. Commercial COBOL compilers have got very good at what they do, and the idea of replacing a COBOL program working on COBOL-like data with a Java program using BigDecimal is not one that appeals unless you are desperate to slow down a computer that's too fast.

Of course, nothing says you couldn't have a decent structured modular programming language with strong static typing that was capable of working on COBOL data. It's called Ada. And again, nothing stops someone making a nice structured language whose compiler targets COBOL. Come to think of it, there's no reason there couldn't be a Haskell compiler with really efficient support for fixed-point decimal numbers. It just seems unlikely that Haskell programmers would want it.

As other people have noted, sometimes the environment of a language is as important as the language itself. I've never used Eclipse -- though I've tried and failed -- but Eclipse support for COBOL is said to be excellent.
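
To make the arithmetic point concrete, here is a minimal sketch in fixed-format COBOL 85 style. The program and paragraph names, the MY-TOTAL field, and the 1.005 multiplier are invented for illustration; the PICTURE clauses and ON SIZE ERROR are the standard machinery described above.

       IDENTIFICATION DIVISION.
       PROGRAM-ID. COUNT-DEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      * At least -999,999 .. +999,999, cheap to compute with.
       77 MY-COUNT PICTURE S999999 USAGE COMPUTATIONAL VALUE 999999.
      * Eighteen decimal digits, two of them after the point, packed.
       77 MY-TOTAL PICTURE S9(16)V99 USAGE PACKED-DECIMAL VALUE 0.
       PROCEDURE DIVISION.
       MAIN-PARAGRAPH.
      * MY-COUNT already holds its maximum, so ADD raises a size
      * error and leaves MY-COUNT unchanged.
           ADD 1 TO MY-COUNT
               ON SIZE ERROR
                   DISPLAY "MY-COUNT WOULD OVERFLOW"
           END-ADD.
      * Decimal arithmetic with rounding, checked the same way.
           COMPUTE MY-TOTAL ROUNDED = MY-COUNT * 1.005
               ON SIZE ERROR
                   DISPLAY "MY-TOTAL WOULD OVERFLOW"
           END-COMPUTE.
           DISPLAY "COUNT " MY-COUNT " TOTAL " MY-TOTAL.
           STOP RUN.

When a size error is caught like this, the receiving field is left unchanged -- the kind of guarantee the standard spells out no matter how the compiler chose to represent the items.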