* 白熊 <[email protected]> [2014-08-04 12:12:03+0400] > Hello everybody: > > I'm learning Guile. I'd like to write a utility, which will enable me > upon user input download program sources from the web, i.e. not a > web-spider, but download discrete URL's and save them to disk. > > I've looked at: > > http://lists.gnu.org/archive/html/guile-user/2012-04/msg00032.html > http://lists.gnu.org/archive/html/guile-user/2012-05/msg00005.html > > so the possibility to do, seems to be there. > > Specifically, let's say, I want to download: > > http://ftp.gnu.org/pub/gnu/guile/guile-2.0.11.tar.xz > > and save it to ~/ > > How should I modify the following code to do this?: It seems that `http-get` do not manage SSL, and GNU site 302 to https version (or is it just my proxy config?)
So here is an example using a different URL:
(use-modules ((web uri)    #:select (string->uri))
             ((web client) #:select (http-get)))
(use-modules (rnrs io ports))   ; for put-bytevector

(define *url*
  "http://hackage.haskell.org/package/HaTeX-2.1.3/HaTeX-2.1.3.tar.gz")

(call-with-values
    (lambda () (http-get (string->uri *url*)))
  (lambda (response body)   ; first value is the response object, second the body
    ;; For a non-text content type the body is a bytevector, so open
    ;; the output file in binary mode and write it out verbatim.
    (with-output-to-file "some.tar.gz"
      (lambda () (put-bytevector (current-output-port) body))
      #:binary #t)))
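Since the original question asked about saving under ~/, the same idea
can be wrapped in a small procedure that builds the destination path
from the HOME environment variable (Guile does not expand "~" in file
names by itself). This is only a sketch under the same limitation noted
above (plain http, no SSL), and `download-to-home` is a name I made up
for illustration:

```scheme
(use-modules ((web uri)      #:select (string->uri))
             ((web client)   #:select (http-get))
             ((rnrs io ports) #:select (put-bytevector)))

(define (download-to-home url filename)
  ;; Fetch URL and write the raw response body to $HOME/filename.
  (call-with-values
      (lambda () (http-get (string->uri url)))
    (lambda (response body)
      (with-output-to-file (string-append (getenv "HOME") "/" filename)
        (lambda () (put-bytevector (current-output-port) body))
        #:binary #t))))

;; e.g.:
;; (download-to-home
;;  "http://hackage.haskell.org/package/HaTeX-2.1.3/HaTeX-2.1.3.tar.gz"
;;  "HaTeX-2.1.3.tar.gz")
```

Note there is no error handling here: a 302 or 404 response would be
written to the file as-is, so checking the response code before saving
would be a sensible refinement.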
--
Best regards, Dmitry Bogatov <[email protected]>,
Free Software supporter, esperantisto and netiquette guardian.
GPG: 54B7F00D
