
Mike Gunter writes: Why is executable size a barrier? 1.64 megabytes (that's the size of the executable I most recently built with GHC) of disk space costs less than half a cent.
I don't like this argument. Can I go to a computer store, pay a cent, and get a hard disk with 1.64 megabytes or more of space? Until I can, I won't believe that 1.64 megabytes of disk space costs less than half a cent.
When a compiler does not perform as well as other compilers (e.g., in the size of the code it generates), it is important to ask: why is that, and is there anything we can do to improve it? Being critical is the first step towards progress. (Of course, these questions should be asked constructively rather than as whining.) By the same logic, why would anyone optimize code for time? A second of electricity and labour costs less than a cent...
Of course, we're always looking for ways to reduce the size of binaries. But the situation might not be as bad as you think: first, don't forget to strip the binary if you're worried about disk space, since the symbol table in a GHC-generated binary can be quite large (it doesn't affect the runtime or anything else, though).

The subject of shared libraries has come up several times in the past; take a look through the archives for some of the previous discussions. The upshot is that shared libraries wouldn't buy much unless you genuinely need to save the disk space: by every other measure, static linking comes out better. Unfortunately, GHC-compiled libraries are very tightly coupled, which means it's unlikely you'd be able to swap in a newer version of a shared library unless it was compiled with *exactly* the same compiler and set of libraries as the old one.

Cheers, Simon
participants (1): Simon Marlow