> I manage a Subversion server that has the following configuration:
> - SVN 1.6.9
> - FSFS storage mode
> - Apache + mod_dav + subversion modules
> - SUSE Linux Enterprise, 32-bit
> 
> On this SVN server, there are around 1100 SVN repositories for
> around 2000 users. I have small repositories and also very heavy
> ones (the heaviest weighs around 33 GB on my Linux filesystem).
> Together, my repositories total around 1 TB.
> 
> Do you know if there is a size limit for an SVN repository in
> Subversion?
> Do you know if there is a limit on the number of SVN repositories
> on a Subversion server? Does it really decrease performance on
> the Subversion server?

This really depends upon the hardware and how the users are using
the server.  That said, the largest server I have has 1800
repositories serving around 6500 users.  The largest repository
is around 400 GB, with around 7 TB of total storage.  The largest
single commit I have seen is around 53 GB.

The larger a repository gets, the longer maintenance activities
such as verifying, filtering, dumping, and loading it take.
This is why I'd recommend staying away from large repositories
and large commits, but they do work.
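
For what it's worth, here is a minimal sketch of the kind of
maintenance pass I mean, written in Python.  The parent directory,
the backup path, and the one-repo-per-subdirectory layout are
made-up examples, not a recommendation; it just drives the stock
svnadmin verify and svnadmin dump commands:

    #!/usr/bin/env python
    # Minimal sketch: verify each repository under a parent
    # directory and dump it to a backup area.  Paths are examples.
    import os
    import subprocess

    REPO_PARENT = "/srv/svn/repos"    # assumed: one repo per subdir
    BACKUP_DIR = "/srv/svn/backups"   # assumed backup destination

    for name in sorted(os.listdir(REPO_PARENT)):
        repo = os.path.join(REPO_PARENT, name)
        if not os.path.isdir(os.path.join(repo, "db")):
            continue  # not an FSFS repository, skip it

        # 'svnadmin verify' reads every revision; on a 400 GB
        # repository this is exactly the step that gets slow.
        subprocess.check_call(["svnadmin", "verify", "--quiet", repo])

        # Full dump piped through gzip; incremental dumps are
        # kinder to I/O on the really large repositories.
        dump_file = os.path.join(BACKUP_DIR, name + ".dump.gz")
        with open(dump_file, "wb") as out:
            dump = subprocess.Popen(
                ["svnadmin", "dump", "--quiet", repo],
                stdout=subprocess.PIPE)
            gzip = subprocess.Popen(
                ["gzip", "-c"], stdin=dump.stdout, stdout=out)
            dump.stdout.close()
            gzip.communicate()
            dump.wait()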

Subversion seems to be I/O bound, even on a high-end SAN.
Subversion 1.7 definitely seems to chew more CPU and memory,
though.  But I've also seen multiple 1 Gb NICs near saturation
on the server...

Things that can kill performance:
- Slow filesystem I/O
- Poorly written hook scripts (see the sketch after this list)
- Commits with large numbers of files (1M+)
- Lots of files locked (hundreds of thousands+)
- Slow authentication servers
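
As an example of what a hook should at least try to do sensibly,
here is a minimal pre-commit sketch in Python that rejects
transactions touching more than a configured number of paths.
The 50000 threshold is an assumed example value, and it relies
only on the stock 'svnlook changed' command:

    #!/usr/bin/env python
    # Minimal pre-commit sketch: reject transactions that touch
    # too many paths.  The limit below is an example, not advice.
    import subprocess
    import sys

    MAX_CHANGED_PATHS = 50000   # assumed threshold

    repo, txn = sys.argv[1], sys.argv[2]

    # 'svnlook changed -t TXN REPO' lists every changed path in
    # the transaction.  Stream the output rather than slurping it
    # all in; reading a million-path commit into memory is itself
    # the kind of poorly written hook I mean.
    proc = subprocess.Popen(["svnlook", "changed", "-t", txn, repo],
                            stdout=subprocess.PIPE)
    count = 0
    for _ in proc.stdout:
        count += 1
        if count > MAX_CHANGED_PATHS:
            proc.kill()
            sys.stderr.write("Commit touches more than %d paths; "
                             "please split it up.\n"
                             % MAX_CHANGED_PATHS)
            sys.exit(1)
    proc.wait()
    sys.exit(0)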

You could easily run into issues depending upon the filesystem
type and how you have organized the repositories.  For example,
one large partition *might* be less efficient.

Kevin R.
