Excerpts from Eelco Dolstra's message of Mon Nov 19 16:31:26 +0100 2012:

> Why would you need a "double fetch"? After running fetchgit, the Git tree
> is in the Nix store and shouldn't be downloaded again unless you do a
> garbage collect in between.

You're right about this. I want to make bundler (which dynamically fetches
updates for the dependencies of Ruby packages) use the Nix store to share
git sources and gem install results.
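For reference, this is roughly what pinning a git source looks like with
nixpkgs' fetchgit today (a minimal sketch; the repository URL, rev, and
sha256 below are made-up placeholders, not a real package):

    with import <nixpkgs> {};

    fetchgit {
      # hypothetical gem repository
      url    = "git://example.org/some-gem.git";
      # full commit hash of the revision to check out
      rev    = "0000000000000000000000000000000000000000";
      # Nix hash of the resulting tree, normally obtained via nix-prefetch-git
      sha256 = "0000000000000000000000000000000000000000000000000000";
    }

As long as url/rev/sha256 are unchanged, rebuilding reuses the existing
store path; the pain point is having to obtain the sha256 up front.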
nixpkgs-ruby-overlay gets the job done, and I could manually package all
git sources in addition to the packages found on rubyforge. It just takes
too long. I want to work the way other Ruby users do: bundle update (fetch
all dependencies, and if this was done previously, reuse the store paths).

Of course running nix-prefetch-git is an option; however, checking whether
a store path representing { url = ..; hash = .. } already exists is harder.
If you run nix-prefetch-git twice, it will fetch twice (wasteful). I
haven't looked for other options. If Nix could handle this, I could just
create a .nix file and I'd always get what I want: the source. If it
already exists, I would not have to bother at all.

About chroot builds: you're right. So maybe a hacky
mkDerivation { allownetwork = true; } would do. It could be used for such
cases. Why should it be allowed? If a programmer wants to shoot himself in
the foot, you can't prevent him from doing so. The goal should therefore be
to make it hard to do by accident, and that property still holds if
something like allownetwork = true existed.

So please comment on whether you see serious security risks in using only
the git URL and git's commit hash. Also note that I'm not saying sha256
checks for fetchgit should no longer be used. I just think they're not
worth the bother for use cases where other tools don't bother either (such
as bundler for Ruby) - they don't even use the full git hash length (which
is bad, IMHO).

Marc Weber

_______________________________________________
nix-dev mailing list
nix-dev@lists.science.uu.nl
http://lists.science.uu.nl/mailman/listinfo/nix-dev