Thanks Philip, Magnus, Sam. There's no question that I have an outlier problem,
but others must have similar ones, for instance with master video files.

I need both remote archiving/retrieval and version control. FYI, we are
archiving compressed Linux disk images for VMs and hypervisors. We are a
hardware/software maker, and manufacturing blasts the disk images directly
onto SSDs. The rest of the repo is a varied mix of far smaller binaries and
text files.
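
In case it helps whoever is testing the speed-up work, a quick way to confirm
where the bulk of a repo's history lives is to list its largest blobs; this is
a generic sketch, not our exact setup:

    # overall object-store size and pack counts
    git count-objects -v

    # list every object in history with its type and size, then keep the
    # 20 largest blobs (third field is the size in bytes)
    git rev-list --objects --all \
      | git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' \
      | awk '$1 == "blob"' \
      | sort -k3 -n \
      | tail -20

On our repo the top of that list is, unsurprisingly, the compressed disk
images.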

Running on a fast desktop, it can take thousands of seconds to perform a
status or commit, completely pegging one or more CPU cores.
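
For what it's worth, this is roughly how I've been measuring it, plus a couple
of settings I understand can cut status cost on huge working trees; treat
these as things to try, not a known fix:

    # time a full status run (this is the thousands-of-seconds case)
    time git status

    # skip the untracked-file scan, often the slow part on big trees
    git status -uno

    # let git stat the index with multiple threads
    git config core.preloadindex true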

On 05/15/2014 12:06 PM, Philip Oakley wrote:
> From: "John Fisher" <fishook2...@gmail.com>
>> I assert, based on one piece of evidence (a post from a Facebook dev), that
>> I now have the world's biggest and slowest git repository, ...
>
> At the moment some of the developers are looking to speed up some of the
> code on very large repos, though I think they are looking at code repos
> rather than large file repos. They were looking for large repos to test some
> of the code upon ;-)
>
> I've copied the Git list should they want to make any suggestions.
> -- 
> Philip

