
On 11/30/2015 03:59 AM, Joachim Durchholz wrote:
> On 11/30/2015 12:24 AM, Michael Orlitzky wrote:
>> GHC does dynamic linking now, but I'm OK with static linking as long as it's tracked. The end result is the same as if you had dynamic linking, only with a lot more wasted space and rebuilds/reinstalls.
> Well, the idea was to use static linking to keep the libraries themselves outside the view of the package manager. I.e. the libraries aren't tracked inside the package manager; they become part of your upstream.
I get the idea, but it doesn't work in general. To use a cheesy example, what if OpenSSL had been statically linked into everything when Heartbleed was announced? If the static linking goes untracked, how do you fix it? You need to rebuild everything that was linked against OpenSSL, but how do you find those packages and rebuild them? Can you explain your answer to a typical "Windows Update" user? Less-serious vulnerabilities pop up every day and require the same attention, so whatever solution you come up with needs to be fast and automated.
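
To make that concrete, here's a minimal sketch of what "tracked" buys you, assuming the package manager records which libraries each package statically linked at build time. The package names and the dependency map below are invented purely for illustration, not any real tool's data; the rebuild set for a vulnerable library is just its transitive reverse dependencies:

import qualified Data.Map.Strict as M
import qualified Data.Set as S

type Pkg = String

-- Recorded at build time: which libraries each package linked statically.
-- (These entries are made up for illustration.)
linkedAgainst :: M.Map Pkg [Pkg]
linkedAgainst = M.fromList
  [ ("curl",      ["openssl", "zlib"])
  , ("git",       ["curl", "zlib"])
  , ("some-app",  ["git"])
  , ("unrelated", ["zlib"])
  ]

-- Invert the map: library -> packages that baked a copy of it in.
reverseDeps :: M.Map Pkg [Pkg]
reverseDeps = M.fromListWith (++)
  [ (dep, [pkg]) | (pkg, ds) <- M.toList linkedAgainst, dep <- ds ]

-- Everything that transitively embeds the vulnerable library, i.e. the
-- set of packages that must be rebuilt when (say) openssl is patched.
rebuildSet :: Pkg -> S.Set Pkg
rebuildSet vuln = go S.empty [vuln]
  where
    go seen []     = S.delete vuln seen
    go seen (p:ps)
      | p `S.member` seen = go seen ps
      | otherwise         = go (S.insert p seen)
                               (M.findWithDefault [] p reverseDeps ++ ps)

main :: IO ()
main = print (rebuildSet "openssl")
-- fromList ["curl","git","some-app"]

Without that recorded map, the information simply doesn't exist anywhere on the system, and no amount of automation can recover it after the fact.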
> I do wonder about the "a lot more wasted space" bit. How much space are we really talking about?
Compiled programs aren't too bad. If you pull in a ton of libraries, you might see 50MB of overhead for a huge program. But if you go all-in like NodeJS and bundle dependencies recursively, you can find yourself pulling in 500MB of dependencies for helloworld.js. That's not anyone's main objection, though. If we could fix dependencies by wasting disk space, everyone would be on board.
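
For a rough feel of where the blow-up comes from, here's a toy calculation; the dependency tree and sizes are invented. A bundled, node_modules-style install pays for every copy in the tree, while a shared (or deduplicated) install pays for each distinct package once:

import qualified Data.Map.Strict as M
import qualified Data.Set as S

type Pkg = String
type MB  = Double

-- Invented dependency tree and sizes, purely for illustration.
deps :: M.Map Pkg [Pkg]
deps = M.fromList
  [ ("helloworld", ["framework"])
  , ("framework",  ["http", "json", "util"])
  , ("http",       ["util", "json"])
  , ("json",       ["util"])
  , ("util",       [])
  ]

sizeOf :: Pkg -> MB
sizeOf p = M.findWithDefault 1 p sizes
  where sizes = M.fromList [("framework", 5), ("http", 3), ("json", 2), ("util", 4)]

-- Bundled: every package ships its own copies of its dependencies,
-- so the installed size is the size of the whole tree, duplicates and all.
bundled :: Pkg -> MB
bundled p = sizeOf p + sum (map bundled (M.findWithDefault [] p deps))

-- Shared: each distinct package is installed exactly once.
shared :: Pkg -> MB
shared root = sum (map sizeOf (S.toList (go (S.singleton root) [root])))
  where
    go seen []     = seen
    go seen (p:ps) =
      let new = filter (`S.notMember` seen) (M.findWithDefault [] p deps)
      in go (foldr S.insert seen new) (new ++ ps)

main :: IO ()
main = do
  putStrLn ("bundled: " ++ show (bundled "helloworld") ++ " MB")
  putStrLn ("shared:  " ++ show (shared "helloworld") ++ " MB")
-- bundled: 29.0 MB vs shared: 15.0 MB even for this tiny tree; real
-- node_modules trees are much deeper, so the gap grows fast.

Even so, disk is the cheap part of the problem; the expensive part is knowing what to rebuild, which is the tracking issue above.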