
In case anybody else is interested: for now I settled on a Stack-based solution.
`STACK_ROOT=<some-temp-folder> stack build --dry-run --prefetch`
works like a charm. What I haven't tackled yet is having multiple GHC
installations in that folder; somehow I'll find a solution for that as
well.
For now I live with the fact that
`STACK_ROOT=<extracted-archive-from-prior-step> stack build --install-ghc`
will connect to the internet to download GHC.
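For anyone following along, the two-step workflow above can be sketched roughly as follows (paths and archive names are illustrative, not the ones I actually use; as noted, `--install-ghc` still needs network access for GHC itself):

```shell
# Step 1 (machine WITH internet): prefetch all dependency sources
# into a throwaway STACK_ROOT, then archive it for shipping.
STACK_ROOT=/tmp/stack-offline stack build --dry-run --prefetch
tar czf stack-offline.tar.gz -C /tmp stack-offline

# Step 2 (target machine): unpack the cached STACK_ROOT and build
# against it. Note: --install-ghc will still reach out to the
# internet to fetch GHC, which is the remaining gap.
tar xzf stack-offline.tar.gz -C /tmp
STACK_ROOT=/tmp/stack-offline stack build --install-ghc
```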
Thanks for all the responses. I'll send an update when the GHC thing is
solved as well.
Best
Jan
Herbert Valerio Riedel
Hi,
On 2016-11-24 at 07:57:57 +0100, Jan von Löwenstein wrote:
Consider the following use case: I have to ship a Haskell application as source, bundled with its dependencies and a script that can produce the binary on a machine without internet connectivity.
This situation is not that uncommon; one example is when you have build-bots which are more or less cut off from the internet (such as Ubuntu's Launchpad build farm).
Is that possible with Stack or Cabal? Any pointers would be much appreciated.
Ideally I would like to build each dependency package individually. That way I could cache results per Haskell package and don't need to rebuild dependencies until they actually change.
This sounds like something I'd use cabal's filesystem-local repository feature[1] for, and it should be possible to specify such local repositories in your `cabal.config` or `cabal.project`[2] files to produce a self-contained source distribution.
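As a rough sketch of what that could look like (the repository name and paths below are made up for illustration; check the linked docs for the exact stanza syntax and repository layout your cabal version supports):

```
-- in cabal.config (or cabal.project): point cabal at a
-- filesystem-local package repository instead of Hackage
repository my-local-repo
  url: file:/path/to/local/repository

-- the repository directory is expected to hold the bundled
-- dependency tarballs, e.g. (hypothetical package):
--   /path/to/local/repository/text/1.2.2.1/text-1.2.2.1.tar.gz
```

With the dependency tarballs bundled into that directory, the build on the offline machine can resolve everything locally.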
If you're interested in pursuing this approach or have any questions about it, let me know!
[1]: http://cabal.readthedocs.io/en/latest/installing-packages.html#repository-sp...
[2]: http://cabal.readthedocs.io/en/latest/nix-local-build-overview.html
Cheers