
Btw, there's just one thing I'm worried about with keeping those large MinGW binary tarballs in a Git repo:
The Git repo will grow monotonically with each new compressed .tar.{bz2,lzma,gz,...} added, with little opportunity for Git to detect shared bitstreams between them. So each MiB of binary data added grows the Git repo that everyone has to clone by roughly that same amount, even for users who only want the latest MinGW tarball for a single 32- or 64-bit platform.
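To make the effect concrete, here is a quick local experiment (a sketch only; it assumes GNU tar/sed/coreutils, and the sizes are illustrative):

    # Sketch: show that git cannot delta-compress recompressed tarballs.
    mkdir tarball-demo && cd tarball-demo && git init -q
    seq 1 2000000 > payload                   # ~15 MiB of compressible text
    tar czf mingw-v1.tar.gz payload
    git add mingw-v1.tar.gz && git commit -qm 'add v1 tarball'
    sed -i '1s/^/changed /' payload           # one small change near the start
    tar czf mingw-v2.tar.gz payload
    git add mingw-v2.tar.gz && git commit -qm 'add v2 tarball'
    git gc -q --aggressive                    # let git try its hardest to delta
    git count-objects -vH                     # size-pack ~ v1 + v2, not ~ v1

Even though the two payloads differ by only a few bytes, the two gzip bitstreams share almost nothing at the byte level, so the pack typically ends up close to the sum of the two tarballs' sizes.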
Right now, cloning the ghc-tarballs.git repo requires fetching ~130 MiB.
Can't we simply put the tarballs in a plain HTTP folder on http://ghc.haskell.org, and store a list (or rather a shell script) of URLs+checksums in ghc.git to retrieve the tarballs on demand when needed?
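For concreteness, such a script might look something like this (a sketch; the base URL, file names, and checksums below are placeholders, not real ghc.haskell.org paths):

    #!/bin/sh
    # fetch-tarballs.sh -- sketch of an on-demand fetcher kept in ghc.git.
    set -e
    BASE=http://ghc.haskell.org/mingw        # hypothetical HTTP folder

    while read sha256 file; do
        if [ ! -f "$file" ]; then
            echo "fetching $file ..."
            curl -fLo "$file" "$BASE/$file"
        fi
        # verify the checksum whether the file was cached or just downloaded
        echo "$sha256  $file" | sha256sum -c -
    done <<'EOF'
    <sha256-of-64bit-tarball>  mingw-w64-x86_64.tar.bz2
    <sha256-of-32bit-tarball>  mingw-w64-i686.tar.bz2
    EOF

Run it from the directory the tarballs should land in; sha256sum ships with GNU coreutils (and with MSYS on Windows), and wget would do in place of curl.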
On 10/06/14 10:42, Herbert Valerio Riedel wrote:
> [...]

I agree with this. Having binaries in git is really dirty for several reasons. It would be cleaner to retrieve the source and build it through the build system. But I suspect Windows people don't commonly do this(?), so checking for the binaries, and if they're not found, downloading them (or asking the user to fix their paths) would likely suffice; a sketch of such a check follows below. The binaries should in any event not be in git... And as hvr points out, tarballs are a mess by themselves regardless of whether they contain binaries or source, because git (rightly) thinks they are blobs.

Apologies for any assumptions made in this email that don't hold true -- I do not use Windows.

--
Alexander
alexander@plaimi.net
https://secure.plaimi.net/~alexander
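The check-then-fetch step described above could be as simple as the following (the directory name is made up, and it reuses the hypothetical fetch-tarballs.sh from earlier in the thread):

    # Sketch: configure-time check for the MinGW tarballs.
    TARBALL_DIR=ghc-tarballs/mingw           # hypothetical location
    if ls "$TARBALL_DIR"/*.tar.* >/dev/null 2>&1; then
        echo "MinGW tarballs found in $TARBALL_DIR"
    else
        echo "MinGW tarballs missing; trying to fetch them..."
        (cd "$TARBALL_DIR" && sh fetch-tarballs.sh) || {
            echo "Could not fetch the tarballs." >&2
            echo "Download them manually or fix your paths." >&2
            exit 1
        }
    fi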