Michael Haggerty <mhag...@alum.mit.edu> writes:

> It would be pretty annoying to spend a lot of time fetching a big pack,
> only to have the fetch fail because one reference out of many couldn't
> be updated.  This would force the user to download the entire pack
> again,...

Is that really true?  Doesn't the quickfetch optimization kick in for
the second fetch?
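
Roughly speaking, that shortcut is a local connectivity test: before
setting up a pack transfer, fetch checks whether every tip it is about
to fetch is already present and fully connected in the local object
store, and skips the download when it is.  Here is a minimal sketch of
that kind of test, driving "git rev-list" from Python; the function
name and its arguments are only illustrative, not git's actual
internal API:

    import subprocess

    def all_tips_already_local(repo_dir, remote_tips):
        """Approximate the connectivity check behind the quickfetch
        shortcut: if every advertised tip is already present and
        connected locally, there is nothing to download.

        remote_tips is a non-empty list of object names (hex)."""
        result = subprocess.run(
            ["git", "rev-list", "--objects", "--stdin",
             "--not", "--all", "--quiet"],
            cwd=repo_dir,
            input="\n".join(remote_tips) + "\n",
            text=True,
            capture_output=True,
        )
        # Exit status 0 means rev-list could walk every listed object
        # down to things we already have; non-zero means at least one
        # object is missing and a real transfer is needed.
        return result.returncode == 0

So a second fetch right after a failed one would only have to repeat
the transfer for objects that never made it into the local object
store in the first place.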
