Changes are local to the last 'po' loop. The error path is eliminated
with the 'errors' argument to getPackage(), and the OK path is merged
into checkfunc.
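
A minimal standalone sketch of the reshaped loop (the 'errors'
argument to getPackage() is from this patch; all other names here are
illustrative, not actual yum code):

    def download_all(pkgs, getPackage, verify):
        errors = {}                    # po -> list of error messages
        for po in pkgs:
            checkfunc = (verify, (po,), {})
            getPackage(po, checkfunc=checkfunc, errors=errors)
        return errors                  # empty means every package verified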
---
yum/__init__.py | 59 +-
1 files changed, 27 insertions(+), 32 deletions(-)
diff --git a/yum/_
Don't expect an exception; handle errors in the 'failfunc' callback.
If the 'errors' argument is provided, call it with the error message.
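
A standalone sketch of this callback-style error path (only the
'errors'-callable idea comes from the patch; the downloader here is
plain urllib, not yumRepo code):

    import urllib.request

    def grab(url, filename, errors=None):
        try:
            urllib.request.urlretrieve(url, filename)
            return filename
        except OSError as e:
            if errors is not None:
                errors(str(e))         # "call it with the error message"
                return None
            raise                      # old behaviour: exception propagates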
---
yum/yumRepo.py | 51 +++
1 files changed, 31 insertions(+), 20 deletions(-)
diff --git a/yum/yumRepo.py b/yum/yumRe
URLGrabber has a built-in limit on the number of concurrent connections
to use when downloading files. Set this option to N >= 0 to change
that limit. The default is '-1' (use urlgrabber's default).
'0' has a special meaning: disable the parallel download code, so that
urlgrab(), even between parallel_begin() and parallel_end(), stays
blocking.
default_grabber.opts.parallel:
    concurrent connections limit.
    0 = always use blocking urlgrab()
parallel_begin():
    start queueing grab requests
parallel_end():
    process queue in parallel
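
A usage sketch under these assumptions (the parallel_begin() and
parallel_end() names come from this patch and are assumed to be
exported from urlgrabber.grabber; URLs are placeholders):

    from urlgrabber.grabber import urlgrab, default_grabber
    from urlgrabber.grabber import parallel_begin, parallel_end

    default_grabber.opts.parallel = 4      # at most 4 connections;
                                           # 0 forces blocking urlgrab()
    parallel_begin()                       # start queueing grab requests
    for name in ('a.rpm', 'b.rpm', 'c.rpm'):
        urlgrab('http://example.com/' + name, filename=name)  # queued
    parallel_end()                         # process the queue in parallel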
---
urlgrabber/grabber.py | 171 +++
This callback is called when a urlgrab request fails.
If the grab is wrapped in a mirror group, only the mirror
group issues the callback.
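
A sketch of the callback with a mirror group (URLGrabber and
MirrorGroup are real urlgrabber classes; the 'failfunc' keyword is
from this patch, and its argument is assumed to be a CallbackObject
carrying the exception):

    from urlgrabber.grabber import URLGrabber
    from urlgrabber.mirror import MirrorGroup

    def failed(obj):
        # fires once per request, only after every mirror was tried
        print('giving up: %s' % obj.exception)

    mg = MirrorGroup(URLGrabber(), ['http://mirror1.example.com/',
                                    'http://mirror2.example.com/'])
    mg.urlgrab('repodata/repomd.xml', failfunc=failed)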
---
urlgrabber/grabber.py | 20 +++-
urlgrabber/mirror.py | 11 ++-
2 files changed, 29 insertions(+), 2 deletions(-)
diff --git a/urlg
---
urlgrabber/grabber.py | 49 ++---
1 files changed, 26 insertions(+), 23 deletions(-)
diff --git a/urlgrabber/grabber.py b/urlgrabber/grabber.py
index f292565..303428c 100644
--- a/urlgrabber/grabber.py
+++ b/urlgrabber/grabber.py
@@ -1595,29 +1595
---
urlgrabber/grabber.py | 34 ++
1 files changed, 18 insertions(+), 16 deletions(-)
diff --git a/urlgrabber/grabber.py b/urlgrabber/grabber.py
index c099d3e..f292565 100644
--- a/urlgrabber/grabber.py
+++ b/urlgrabber/grabber.py
@@ -1575,22 +1575,7 @@ class PyC
Total percentages are calculated in re.fraction_read(), and are relative
to re.last_amount_read and re.total. Use these also for the done/total
sizes. Keep total_size for compatibility.
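
A sketch of what this gives the meter (fraction_read(),
last_amount_read and total exist on urlgrabber.progress.RateEstimator;
the formatting function itself is illustrative):

    def progress_line(re):
        frac = re.fraction_read()          # None when total is unknown
        done, total = re.last_amount_read, re.total
        if frac is None:
            return '%d bytes' % done
        return '%3d%% (%d/%d bytes)' % (frac * 100, done, total)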
---
urlgrabber/progress.py |2 +-
1 files changed, 1 insertions(+), 1 deletions(-)
diff --git a/urlgrabber/progre
---
urlgrabber/grabber.py |9 -
1 files changed, 8 insertions(+), 1 deletions(-)
diff --git a/urlgrabber/grabber.py b/urlgrabber/grabber.py
index f6f57bd..c099d3e 100644
--- a/urlgrabber/grabber.py
+++ b/urlgrabber/grabber.py
@@ -1313,8 +1313,15 @@ class PyCurlFileObject(object):
There may be exactly one update, in which case last_update_time == start_time.
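
Illustrative sketch of the failure mode and the guard (not the actual
patch hunk; attribute names follow urlgrabber.progress.RateEstimator):

    def average_rate(re):
        elapsed = re.last_update_time - re.start_time
        if elapsed <= 0:                   # exactly one update so far
            return None                    # too early to estimate a rate
        return re.last_amount_read / elapsed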
---
urlgrabber/progress.py |2 +-
1 files changed, 1 insertions(+), 1 deletions(-)
diff --git a/urlgrabber/progress.py b/urlgrabber/progress.py
index 45eb248..f28b632 100644
--- a/urlgrabber/progress.py
+++ b/urlgrabber/pro
See also: TextMeter._do_update()
---
urlgrabber/progress.py |2 +-
1 files changed, 1 insertions(+), 1 deletions(-)
diff --git a/urlgrabber/progress.py b/urlgrabber/progress.py
index 2a6894f..1de0abd 100644
--- a/urlgrabber/progress.py
+++ b/urlgrabber/progress.py
@@ -503,7 +503,7 @@ class Te
The rate estimator is only partially updated in update_meter().
Make sure the downloaded size reported at the end includes the
data read in the last 0.3 seconds.
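
A sketch of the idea (illustrative; RateEstimator.update() has this
signature in urlgrabber.progress, but the end() method here is assumed):

    import time

    def end(self, amount_read, now=None):
        # update_meter() throttles estimator updates to ~0.3 s, so
        # force one final update before reporting the total size
        self.re.update(amount_read, now or time.time())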
---
urlgrabber/progress.py |1 +
1 files changed, 1 insertions(+), 0 deletions(-)
diff --git a/urlgrabber/progress.py b/urlgrabber/progress.py
First, fix some issues with MultiFileMeter.
[PATCH 1/9] Prevent float division by zero
[PATCH 2/9] MultiFileMeter: show correct finished size
[PATCH 3/9] TextMultiFileMeter: use 'text' instead of 'basename'.
[PATCH 4/9] Use re.total instead of total_size.
Then, move parts of PyCurlFileObject code
On Mon, 2011-09-26 at 03:33 +0530, Tirtha Chatterjee wrote:
> Hi!
>
> package, and it starts updating all the repo data at once. Since I am
> package, and it starts updating all the repo data at once. Since I am
> mostly behind a slow internet connection, this is all the more
> frustrating. I read up
> I conducted a few tests, and found that the diff size is much less in
> general when I perform the diff between xml files than in the case of
> sqlite files.
>
> Also, what is the advantage of using sqlite instead of xml (since I
> could not find this anywhere in the wiki)?
XML compresses and diff
Hi!
Thanks for the interest in improving MD download!
Just a few thoughts (I don't consider myself experienced
in the codebase, esp on the createrepo part).
- Sharding metadata is very likely not an option.
The per-file overhead (to download, to store, to query)
is significant, and to get a sig
> > > I meant that traditionally to have packages "move arch" we had
> > > them obsolete older versions of themselves in their specfiles.
> >
> > Oh, now I see. That shouldn't hurt, but is it necessary?
>
> I guess it's rather a "make this special case explicit" and not the
> default.
Ok, I get