On Wed, 2011-07-27 at 14:23 +0200, Zdeněk Pavlas wrote:
> A simple interface to external downloading processes, with
> limited support for transparent retries and mirrorlist cycling.
> ---
>  yum/yumRepo.py |   79 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++
>  1 files changed, 79 insertions(+), 0 deletions(-)
This is a somewhat better PoC, I guess. urlgrabber is now not involved at all, AFAICS. It's running an sh script that doesn't take any params, and there is no way to get progress data out, AFAICS. The API both returns before what you've requested is downloaded _and_ can block for an indeterminate amount of time.

The downloaders are global, so one downloader might end up talking to both rpmforge.org and redhat.com ... which also means keepalive is going to be interesting. I'm not sure why you made the downloaders process-global too. This uses select directly instead of poll; is there some reasoning? asyncore next (might be the best yet ?:).

If you are stuck trying to solve "the big problem" all at once, and are just sending this out as an update of where you are atm. ... I can understand, but you'll probably go crazy trying to do it that way (and maybe take me with you ;). My suggestion would be to solve a small part of the problem fully. Eg. get an "almost 100%" patch for urlgrabber.grab() that spawns a single process, does the download and returns progress info. Then when we've got that, we can start from that base so it can run 2 procs. at once ... then we can look at urlgrabber APIs so it can spawn 2 procs. at once, then ... eventually the pain needed to get it integrated into yum.

_______________________________________________
Yum-devel mailing list
[email protected]
http://lists.baseurl.org/mailman/listinfo/yum-devel
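For the record, here is a minimal sketch of the "solve a small part fully" suggestion: spawn a single downloader child process and stream progress back to the parent over a pipe. All of the names here (grab, progress_cb, the "progress got total" line protocol) are made up for illustration; this is not urlgrabber's actual API, and the child uses python3's urllib.request rather than whatever the real patch would use.

```python
# Hypothetical sketch, not the urlgrabber API: one child process does the
# download and prints "progress <got> <total>" lines; the parent reads
# them and invokes a callback.
import subprocess
import sys

# Code run in the child process (python3 urllib.request; the yum of this
# era was python2, so treat this as illustrative only).
CHILD_CODE = r'''
import sys, urllib.request
url, dest = sys.argv[1], sys.argv[2]
resp = urllib.request.urlopen(url)
total = int(resp.headers.get("Content-Length") or 0)
got = 0
with open(dest, "wb") as f:
    while True:
        chunk = resp.read(16384)
        if not chunk:
            break
        f.write(chunk)
        got += len(chunk)
        # One progress report per chunk, flushed so the parent sees it
        # promptly rather than after the pipe buffer fills.
        sys.stdout.write("progress %d %d\n" % (got, total))
        sys.stdout.flush()
'''

def grab(url, dest, progress_cb=None):
    """Download url to dest in a single child process.

    progress_cb, if given, is called as progress_cb(bytes_so_far, total)
    for each report the child emits.  Returns the child's exit code
    (0 on success).  Hypothetical name/signature, for illustration.
    """
    proc = subprocess.Popen([sys.executable, "-c", CHILD_CODE, url, dest],
                            stdout=subprocess.PIPE)
    try:
        for line in proc.stdout:
            parts = line.split()
            if progress_cb and parts and parts[0] == b"progress":
                progress_cb(int(parts[1]), int(parts[2]))
    finally:
        proc.stdout.close()
    return proc.wait()
```

Once something like this works for one process, extending it to N concurrent downloads is mostly a matter of select()/poll()-ing over several such pipes, which is where the select-vs-poll question above becomes relevant.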
