On Sat, Feb 9, 2013 at 3:56 AM, Johan Holmquist <holmisen@gmail.com> wrote:
> The code goes into production and, disaster: the new "improved"
> version runs 3 times slower than the old, making it practically
> unusable. The new version has to be rolled back, with loss of uptime
> and functionality, and management is not happy with P.

> It just so happened that the old code triggered some aggressive
> optimization unbeknownst to everyone, **including the original
> developer**, while the new code did not. (This optimization maybe even

This leads ultimately to not allowing compilers to optimize at all.  I suspect that's a bad plan.  Keep in mind that a modern web application may see enough traffic that it doesn't even take a "hyper-optimization": even a small change in per-request performance can scale up to a large difference overall.
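
To make that concrete, here is a contrived sketch of the kind of thing that can bite you in GHC (my own illustration, not the code from the story above).  With -O2, foldr/build fusion should collapse the first pipeline into a single loop with no intermediate lists; the second, seemingly equivalent version keeps the filtered list alive.  The NOINLINE pragma is just a stand-in for whatever real-world detail (sharing, a module boundary, a slightly different shape of code) kept the rewrite from firing:

    -- With ghc -O2, foldr/build fusion should turn this whole pipeline into
    -- one tight loop: no intermediate list is allocated.
    fused :: Int -> Int
    fused n = sum (map (* 2) (filter even [1 .. n]))

    -- The "same" computation after an innocent-looking refactor.  NOINLINE
    -- here stands in for whatever keeps GHC from seeing producer and consumer
    -- together; the filtered list is now really built in memory.
    unfused :: Int -> Int
    unfused n = sum (map (* 2) evens)
      where
        evens = filter even [1 .. n]
        {-# NOINLINE evens #-}

    main :: IO ()
    main = print (fused 10000000, unfused 10000000)

Compile with ghc -O2 (add -rtsopts if you want to check allocation with +RTS -s) and time the two calls.  Neither version is wrong, which is exactly why nobody notices until production.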

Also... what happens when it's not a manual optimization but a bug fix that triggers the regression?

> Maybe this is something that would never happen in practice, but how
> to be sure...
 
If this really scares you, disable all compiler optimization.  Now you can be sure of what you're getting, even at the large scales where small changes have huge effects... but now you had better be good at hand optimization, and at writing assembly, because that's where your performance will have to come from.
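
With GHC the trade looks roughly like this (Main.hs is just a placeholder name):

    ghc -O0 Main.hs    # no optimization: what you wrote is what runs, predictably slow
    ghc -O2 Main.hs    # full optimization: fast, but how fast depends on which rewrites fire

And with -O0 you pay that price on every hot path, not only the one that happened to regress.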

This sounds like going backwards to me.

--
brandon s allbery kf8nh                               sine nomine associates
allbery.b@gmail.com                                  ballbery@sinenomine.net
unix, openafs, kerberos, infrastructure, xmonad        http://sinenomine.net