Compatibility etiquette for apps, with cabal sandboxes and `stack`

On 29 November 2015 at 20:12, Michael Orlitzky wrote:
On 11/29/2015 01:37 PM, Omari Norman wrote:
Distribution packagers are savvy enough to use stack.
Ignoring the question of *how* that might work, most distributions forbid bundled dependencies because it creates a maintenance nightmare and fills our users' machines with untraceable security vulnerabilities.
But doesn't Haskell do static linking (usually) and cross-module inlining? Or are you fine with static linking as long as it's somehow tracked by the package manager, so that upgrading some-vuln-lib from 1.0 to 1.1 forces upgrading all client programs (looks quite doable at least with Debian packages)?

--
Paolo G. Giarrusso - Ph.D. Student, Tübingen University
http://ps.informatik.uni-tuebingen.de/team/giarrusso/

On 11/29/2015 06:11 PM, Paolo Giarrusso wrote:
On 29 November 2015 at 20:12, Michael Orlitzky wrote:
On 11/29/2015 01:37 PM, Omari Norman wrote:
Distribution packagers are savvy enough to use stack.
Ignoring the question of *how* that might work, most distributions forbid bundled dependencies because it creates a maintenance nightmare and fills our users' machines with untraceable security vulnerabilities.
But doesn't Haskell do static linking (usually) and cross-module inlining? Or are you fine with static linking as long as it's somehow tracked by the package manager, so that upgrading some-vuln-lib from 1.0 to 1.1 forces upgrading all client programs (looks quite doable at least with Debian packages)?
GHC does dynamic linking now, but I'm OK with static linking as long as it's tracked. The end result is the same as if you had dynamic linking, only with a lot more wasted space and rebuilds/reinstalls.
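[Editor's note: to see the two link modes side by side, here is a minimal sketch. The file name is illustrative, and -dynamic requires a GHC installation built with shared Haskell libraries.]

    -- Main.hs: a minimal program for comparing GHC's link modes.
    --
    -- Static linking of Haskell libraries (GHC's traditional default):
    --   ghc -O2 Main.hs -o hello-static
    -- Dynamic linking against shared Haskell libraries:
    --   ghc -O2 -dynamic Main.hs -o hello-dynamic
    --
    -- `ldd hello-static` shows only system libraries (libc, libgmp, ...),
    -- while `ldd hello-dynamic` lists one libHS*.so per Haskell
    -- dependency, which is what a distribution package manager can
    -- track and upgrade independently.
    main :: IO ()
    main = putStrLn "hello"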

On 30.11.2015 at 00:24, Michael Orlitzky wrote:
GHC does dynamic linking now, but I'm OK with static linking as long as it's tracked. The end result is the same as if you had dynamic linking, only with a lot more wasted space and rebuilds/reinstalls.
Well, the idea was to use static linking to keep the libraries themselves outside the view of the package manager. I.e. the libraries aren't tracked by the package manager; they become part of your upstream.

If you insist on tracking the libs inside the package manager, then you retain the disadvantages of both dynamic linking (inability to work with mutually incompatible library version requirements) and static linking (bigger space requirements).

I do wonder about the "a lot more wasted space" bit. How much space are we really talking about?

Regards,
Jo
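[Editor's note: the "mutually incompatible version requirements" parenthetical is the crux of the argument for static linking, so here is a small sketch of it. Package names and version bounds are hypothetical.]

    -- With one shared, system-wide copy of a library, two dependents
    -- with disjoint version bounds cannot both be satisfied. Static
    -- linking sidesteps this: each binary embeds its own copy.
    type Version = Int

    -- Each dependent's acceptable range for the shared library.
    parserA, printerB :: Version -> Bool
    parserA v = v >= 2    -- as if: build-depends: shared-lib >= 2
    printerB v = v < 2    -- as if: build-depends: shared-lib < 2

    -- Versions a single shared copy could have and satisfy both:
    satisfiable :: [Version]
    satisfiable = [ v | v <- [1 .. 5], parserA v, printerB v ]

    main :: IO ()
    main = print satisfiable  -- prints []: no shared version works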

Well, let's think of books vs. tweets. Books:

- take longer to write
- are written to last for at least a couple of years
- are expected to be read by a number of readers, over time
- are planned
- are structured / organized
- cover a few topics in depth
- are reviewed
- are proofread
- some become out of date
- may contain errata

There are different editions. There is usually a time span between them. Sometimes authors change the title between editions. There are different books about similar topics.

Tweets are in several ways just the opposite of books.

I like to think of software libraries as books. Good libraries are like hardcover editions.

On 11/30/2015 03:59 AM, Joachim Durchholz wrote:
On 30.11.2015 at 00:24, Michael Orlitzky wrote:
GHC does dynamic linking now, but I'm OK with static linking as long as it's tracked. The end result is the same as if you had dynamic linking, only with a lot more wasted space and rebuilds/reinstalls.
Well, the idea was to use static linking to keep the libraries themselves outside the view of the package manager. I.e. the libraries aren't tracked inside the package manager, they become part of your upstream.
I get the idea, but it doesn't work in general. To use a cheesy example, what if OpenSSL had been statically linked into everything when Heartbleed was announced? If the static linking goes untracked, how do you fix it? You need to rebuild everything that was linked against OpenSSL, but how do you find those packages and rebuild them? Can you explain your answer to a typical "Windows Update" user? Less-serious vulnerabilities pop up every day and require the same attention, so whatever solution you come up with needs to be fast and automated.
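[Editor's note: to make the "how do you find those packages" question concrete, here is a sketch of the record a package manager would need and the query it enables. The package names and the linkedAgainst table are hypothetical.]

    import qualified Data.Map as Map

    type Package = String

    -- What a package manager would need to record at install time:
    -- for each installed program, the libraries baked into its binary.
    linkedAgainst :: Map.Map Package [Package]
    linkedAgainst = Map.fromList
      [ ("app-a", ["openssl", "zlib"])
      , ("app-b", ["zlib"])
      , ("app-c", [])
      ]

    -- Everything that must be rebuilt when library `vuln` is patched.
    needsRebuild :: Package -> [Package]
    needsRebuild vuln =
      [ p | (p, libs) <- Map.toList linkedAgainst, vuln `elem` libs ]

    main :: IO ()
    main = print (needsRebuild "openssl")  -- ["app-a"]

Without such a table, answering the same question means scanning every binary on disk, which is exactly the "untraceable vulnerabilities" problem raised earlier in the thread.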
I do wonder about the "a lot more wasted space" bit. How much space are we really talking about?
Compiled programs aren't too bad. If you pull in a ton of libraries, you might get 50MB of overhead for a huge program. But if you go overboard like NodeJS and bundle recursively, you can find yourself pulling in 500MB of dependencies for helloworld.js. That's not anyone's main objection, though. If we could fix dependencies by wasting disk space, everyone would be on board.
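[Editor's note: a back-of-the-envelope sketch of those numbers. The 50MB per-program figure is from this message; the count of installed programs is an assumed, hypothetical number.]

    -- Rough space cost of duplicating dependencies into every binary.
    main :: IO ()
    main = do
      let overheadPerProgramMB = 50 :: Int  -- figure from the thread
          installedPrograms    = 20         -- hypothetical assumption
      putStrLn ("Duplicated dependency code: "
                ++ show (overheadPerProgramMB * installedPrograms)
                ++ " MB")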
participants (4)
- Imants Cekusins
- Joachim Durchholz
- Michael Orlitzky
- Paolo Giarrusso