On 07/15/2014 09:43 AM, Woody Wu wrote:
> I have tried several methods I found online, but they always failed.
> I committed some big files to a very old branch; the files were later
> deleted, and new branches were continuously created. Now the checkout
> directory has grown to about 80 megabytes. What's the right way to
> permanently erase those big garbage files?

You probably need to use "git filter-branch" or the BFG Repo-Cleaner
(http://rtyley.github.io/bfg-repo-cleaner/) to rewrite history as if the
big files had never been committed. But beware of the usual warnings
about rewriting history: for example, any collaborators will have to
rebase their branches onto the new history.
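As a rough sketch of the filter-branch route, here is a self-contained
demo: it builds a throwaway repository containing a hypothetical
"bigfile.bin" (the file name, paths, and sizes are placeholders, not
anything from your repository), rewrites every commit to drop that blob,
then expires the reflogs and runs gc so the space is actually reclaimed.
Run something like this only on a fresh clone you can afford to lose:

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"
git init -q repo && cd repo
git config user.email you@example.com
git config user.name "You"

# Simulate the problem: a 1 MB file committed, then deleted later.
echo "docs" > README
dd if=/dev/zero of=bigfile.bin bs=1024 count=1024 2>/dev/null
git add README bigfile.bin && git commit -qm "add big file"
git rm -q bigfile.bin && git commit -qm "delete big file"

# Rewrite all refs so bigfile.bin never existed in any commit.
git filter-branch --force --index-filter \
  'git rm --cached --ignore-unmatch bigfile.bin' \
  --prune-empty --tag-name-filter cat -- --all

# The old objects are still reachable via backup refs and reflogs;
# drop those, then gc so the blobs are really pruned from disk.
rm -rf .git/refs/original
git reflog expire --expire=now --all
git gc --prune=now --aggressive
```

The BFG does the same rewrite much faster; its documented invocation is
along the lines of `java -jar bfg.jar --strip-blobs-bigger-than 1M
repo.git` on a bare mirror clone, followed by the same reflog-expire and
gc steps. Either way, everyone else must re-clone or rebase afterwards.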

Michael

-- 
Michael Haggerty
mhag...@alum.mit.edu