Hello Simon,

On 2014-06-10 at 10:25:46 +0200, Simon Peyton Jones wrote:
[...]

> We physically include very selective chunks of MinGW in a GHC Windows
> distribution
>   - so that users don't need to install MinGW
>   - so that GHC doesn't break just because a user has
>     a different version of MinGW than we expected
>
> We keep these chunks of MinGW in the GHC repo (in ghc-tarballs)
> precisely so that we know exactly which bits to ship.

Btw, there's just one thing I'm worried about with keeping those large
MinGW binary tarballs in a Git repo: the repo will grow monotonically
with each new compressed .tar.{bz2,lzma,gz,...} added, since Git's
delta compression has little opportunity to detect shared bitstreams
in already-compressed data. So each MiB of binary data added grows the
Git repo that everyone has to clone (even if only the latest MinGW for
a specific 32/64-bit platform is desired) by that same amount. Right
now, cloning the ghc-tarballs.git repo requires fetching ~130MiB.

Can't we simply put the tarballs in a plain HTTP folder on
http://ghc.haskell.org, and store a list (or rather a shell script) of
URLs+checksums in ghc.git to retrieve the tarballs on demand if
needed? (A rough sketch is in the P.S. below.)

Cheers,
  hvr
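
P.S.: For anyone who wants to reproduce the ~130MiB figure, the pack
size of a fresh clone can be inspected with something like the
following (the clone URL is a guess on my part, adjust it to wherever
ghc-tarballs.git actually lives; "size-pack" in the output is the
interesting number):

    git clone --mirror http://git.haskell.org/ghc-tarballs.git
    cd ghc-tarballs.git && git count-objects -vH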
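
And here's a minimal sketch of the kind of fetch script I have in
mind (the file names, the manifest format and the download URL are
made up for illustration, and sha256sum is assumed to be available in
the build environment):

    #!/bin/sh
    # Fetch the MinGW tarballs listed in a manifest of the form
    #   <sha256>  <filename>
    # from a plain HTTP folder, verifying each one before use.
    set -e

    BASE_URL="http://ghc.haskell.org/mingw"   # hypothetical download folder
    MANIFEST="mingw-tarballs.sha256"          # hypothetical manifest in ghc.git

    while read -r sum file; do
        [ -z "$file" ] && continue            # skip blank/malformed lines
        case "$sum" in \#*) continue ;; esac  # skip comment lines

        # only download what isn't already present locally
        [ -f "$file" ] || wget -q "$BASE_URL/$file"

        # abort if the checksum doesn't match what ghc.git records
        echo "$sum  $file" | sha256sum -c - || {
            echo "checksum mismatch for $file" >&2
            exit 1
        }
    done < "$MANIFEST"

That would turn the tarballs into a build-time download instead of
baking every historical version into each clone of the repo.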