
I'm sure we could make git handle the tarballs, but it just seems like the
wrong tool for the job. We'd have to use multiple advanced features of git
where a simple wget/curl would do. Versioning is also a non-issue, since
the versions would be embedded in the filenames. In fact, versioning would
be easier and nicer with the versioned filenames listed in a file in the
main repository rather than tracked in a submodule.
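For example, a small manifest checked into the main repository could be as
simple as this (the file name and entries here are only illustrative, not
the actual tarball list):

    # hypothetical manifest, e.g. mk/win32-tarballs.list
    gmp-5.0.4.tar.bz2
    perl-5.16.2.tar.gz

Bumping a tarball would then be an ordinary one-line change recorded in the
main repository's history.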
I was thinking of performing the wget (if necessary) in the Makefile, to
further reduce the number of steps that users have to execute to get a
working build. Any strong objections?
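Roughly what I have in mind, as a sketch only (the URL and file name are
placeholders rather than the real locations):

    TARBALL_SITE = https://haskell.org/SOME-STABLE-FOLDER
    TARBALLS     = ghc-tarballs/example-1.0.tar.bz2

    # download each listed tarball only if it is not already present
    $(TARBALLS):
    	mkdir -p ghc-tarballs
    	wget -O $@ $(TARBALL_SITE)/$(notdir $@)

Since the targets are ordinary files, make would skip the download entirely
when the tarballs are already in place.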
Whom should I contact to get some static files deployed in a folder under
haskell.org?
On Mon, Sep 29, 2014 at 11:40 AM, Thomas Miedema wrote:
3. Why is ghc-tarballs a git repository? That does not seem very wise. [...]
Could we have a stable folder under haskell.org/ to put the files in, to make sure that they never go away, and just wget/curl them from there?
http://thread.gmane.org/gmane.comp.lang.haskell.ghc.devel/4883/focus=4887
Hmm, that was a while ago. Whom should I contact to get the files deployed under haskell.org?
Here's a different solution to the 'big binary blobs' problem:
* Keep the ghc-tarballs git repository, and add it as a submodule.
* Make sure it doesn't get cloned by default:
      git config -f .gitmodules submodule.ghc-tarballs.update none
* Windows developers run (one time, after the initial clone --recursive of
  the ghc repository):
      git config submodule.ghc-tarballs.update checkout
      git submodule update --depth=1
* After that, Windows developers run the normal:
      git submodule update
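For reference, once the submodule is added and the config line above is
applied, the entry in .gitmodules would look roughly like this (the url is
just a placeholder for wherever the ghc-tarballs repository ends up living):

    [submodule "ghc-tarballs"]
        path = ghc-tarballs
        url = <address of the ghc-tarballs repository>
        update = none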
The advantages are:
* only the most recent ghc-tarballs commit gets cloned initially
* subsequent 'git submodule update' runs always make sure the most recent
  version of ghc-tarballs is available
* the full history of ghc-tarballs is tracked, which makes bisecting easier
* no extra scripts needed
I don't know how much space overhead git adds. wget-ting just the files themselves might still be faster.
-- Gintautas Miliauskas