On Sun, 6 May 2012, Stuart Henderson wrote:

On 2012-05-06, Renzo Fabriek <rfabr...@nerdshack.com> wrote:
On Sunday 06 May 2012 18:24:21 Alan Corey wrote:
I just saw another good reason to hit ctrl-C.  I'm on a modem, and I just
hit boost_1_42_0.tar.gz in an install.  That's 40 megs, more than I can
download in a day.  I need to use a different process, like putting the URL
in a text file and feeding it to wget with --continue -i <file>.
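Something like this, roughly (the URL here is just a placeholder):

    # put the distfile URL in a list so wget can resume across sessions
    echo 'http://example.org/pub/boost_1_42_0.tar.gz' > urls.txt
    wget --continue -i urls.txt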

So I hit ctrl-C, and that got me out of downloading from the first site,
only for it to start downloading from another, then another.  What I really
wanted was to be able to copy a working URL and paste it into a text file.

   Alan

On Wed, 25 Apr 2012, Alan Corey wrote:

I've seen this before; I wonder if there's some environment variable I can
set to stop it?

I try make fetch on a port and it fails due to a bad site.  I hit Ctrl-C to
stop it, but it goes on to the next site and downloads the file.  Then it
deletes the file when it finishes.  I type make install and it tries the
bad site again...

 Alan



You are able to get a working URL. All the info for that URL is in the
port's Makefile. You could cut and paste a URL or do some shell scripting
to extract the URLs.
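For example, something naive like this might get most of it (port path is
just an example):

    # naive sketch: pull likely URL pieces straight out of a port's Makefile
    grep -E '^(MASTER_SITES|DISTNAME|DISTFILES)' /usr/ports/devel/boost/Makefile

though anything defined outside the Makefile itself won't show up this way.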

gr
Renzo



It's a bit awkward actually but can be done. You have to take the
output of 'make show=DISTFILES' and prepend the contents of 'make
show=MASTER_SITES' (which can give you multiple URLs), except
where the distfile ends in :0, :1, :2, ... :9, in which case you
need MASTER_SITES0, MASTER_SITES1, etc. The same goes for PATCHFILES
and maybe SUPDISTFILES.
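A rough sh sketch of that combination, using devel/boost as the example
port (untested, and it assumes MASTER_SITES entries end in a slash, as
they conventionally do):

    # build candidate URLs from MASTER_SITES* and DISTFILES for one port
    cd /usr/ports/devel/boost
    for f in $(make show=DISTFILES); do
        case $f in
        *:[0-9])
            # distfile pinned to a numbered site list, e.g. foo.tar.gz:1
            sites=$(make show=MASTER_SITES${f##*:})
            f=${f%:*}
            ;;
        *)
            sites=$(make show=MASTER_SITES)
            ;;
        esac
        for s in $sites; do
            echo "${s}${f}"
        done
    done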

You can't do it reliably without 'make show=...' because the
URL can come from a *.port.mk module, Makefile.inc or
network.conf. And sometimes you need to fall back to a backup at
an openbsd.org server.
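If memory serves, that backup is just another make variable, so something
like this should print it (unverified):

    cd /usr/ports/devel/boost
    make show=MASTER_SITE_BACKUP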



Yeah, at one point when I was getting 5.0 set up I actually took my laptop
somewhere with a WiFi connection. I must have already installed
databases/sqlports, because I extracted a list of distfiles from it, matched
it against what I already had from 4.7, then put the site URL at the
beginning of each line and .tar.gz at the end, and fed it to wget. I got
about 100 megs of distfiles in about 15 minutes and saved about 10 hours of
downloading by modem. Of course they aren't all .tar.gz, and the site didn't
work for all of them, but it helped. I'm still working on parts of
LibreOffice.
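The gist of it was something like this (mirror path from memory, so treat
it as an assumption):

    # prepend a mirror URL and append .tar.gz to bare names, then fetch
    sed -e 's|^|http://ftp.openbsd.org/pub/OpenBSD/distfiles/|' \
        -e 's|$|.tar.gz|' names.txt > urls.txt
    wget --continue -i urls.txt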

  Alan
