Erik Meitner writes:

> Craig Barratt wrote:
> > Erik Meitner writes:
> >
> >   
> >> Hi. We are running BackupPC 2.1.1 (Debian 2.1.1-2sarge1). The BackupPC 
> >> pool is on a 600 GB ext3 partition. For one of our users who has a 
> >> fairly deep directory structure we get a lot of "unable to link" errors 
> >> (see end of message). The files in the pool are not at the hard link 
> >> limit yet:
> >>
> >> # ls -l|sort -n -k 2  | tail
> >> -rw-r-----  25795 backuppc backuppc     22 Jan  7 20:24 fsostrsh.fpt
> >> -rw-r-----  25795 backuppc backuppc     22 Jan  7 20:24 fsostrs.fpt
> >> -rw-r-----  25795 backuppc backuppc     22 Jan  7 20:24 fsosordh.fpt
> >> -rw-r-----  25795 backuppc backuppc     22 Jan  7 20:24 fsosord.fpt
> >> -rw-r-----  25795 backuppc backuppc     22 Jan  7 20:24 fprempy.fpt
> >> -rw-r-----  25795 backuppc backuppc     22 Jan  7 20:24 fprchck.fpt
> >> -rw-r-----  25795 backuppc backuppc     22 Jan  7 20:24 fposrmk.fpt
> >> -rw-r-----  25795 backuppc backuppc     22 Jan  7 20:24 fpoptrsh.fpt
> >> -rw-r-----  25795 backuppc backuppc     22 Jan  7 20:24 fpoptrs.fpt
> >> -rw-r-----  25795 backuppc backuppc     22 Jan  7 20:24 fpopordh.fpt
> >>
> >> Is this a problem with the files being 14 levels deep?
> >>     
> >
> > The path names don't look too long.  As you point out, the
> > per-file hardlink limit is not an issue - 22 links is way
> > less than the typical limit of 32000.
> >
> > Is your file system out of inodes?  Use df -i to check.
> >
> > Craig

> Oops. Mixed up the hard links and file size. Hard links are at 25795.
> Still below 32000 though.
> Inodes are ok too:
> 
> [EMAIL PROTECTED]:~$ df -i /bigvol/
> Filesystem            Inodes   IUsed   IFree IUse% Mounted on
> /dev/sda7            72181152 1804806 70376346    3% /bigvol

Ah, yes.  The hardlink limit is likely the problem after all.  The
25795 count was probably taken after the previous backup had been
expired and removed.  During a new backup, the count likely reaches
32000 before the old backup is expired.
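
A quick way to check whether any pool files are approaching the limit
is to walk the pool and report each file's link count.  This is only a
sketch: the default pool path and the 32000 figure (ext3's per-inode
link limit) are assumptions from this thread, so adjust both for your
setup.

```python
import os
import sys

# Assumption: ext3's per-inode hard-link limit, as discussed in the thread.
EXT3_LINK_MAX = 32000

def files_near_link_limit(pool_dir, headroom=1000):
    """Yield (nlink, path) for files within `headroom` links of the limit."""
    for root, _dirs, names in os.walk(pool_dir):
        for name in names:
            path = os.path.join(root, name)
            nlink = os.lstat(path).st_nlink
            if nlink >= EXT3_LINK_MAX - headroom:
                yield nlink, path

if __name__ == "__main__":
    # Assumption: the Debian default pool location; pass your own TopDir.
    pool = sys.argv[1] if len(sys.argv) > 1 else "/var/lib/backuppc/cpool"
    for nlink, path in sorted(files_near_link_limit(pool), reverse=True):
        print(nlink, path)
```

Run it during a backup to see whether any file actually hits 32000
before the old backup expires.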

The 3.0 code in CVS has a fix for this problem: when the hardlink
limit is reached, BackupPC creates a new pool file instead of failing.
The 2.x code misses certain cases where this is needed, hence the
"unable to link" errors.

Craig
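
The fix Craig describes can be sketched roughly as follows.  BackupPC
itself is written in Perl, so the function and parameter names here are
illustrative only, not the actual implementation: try the hard link,
and if the filesystem reports the link limit (EMLINK), fall back to
writing a fresh copy that can start a new pool chain.

```python
import errno
import os
import shutil

def link_into_backup(pool_file, backup_file):
    """Hard-link backup_file to pool_file; if the pool file has hit the
    filesystem's hard-link limit, copy the data to start a new pool file
    instead of failing."""
    try:
        os.link(pool_file, backup_file)
        return "linked"
    except OSError as exc:
        if exc.errno != errno.EMLINK:
            raise
        # Hard-link limit reached: write a new pool file.  A real
        # implementation would register this copy under a new pool key
        # (e.g. a numeric suffix) so future backups can link to it.
        shutil.copy2(pool_file, backup_file)
        return "copied"
```

The 2.x error in this thread corresponds to the EMLINK case not being
handled on some code paths, so the link attempt surfaces as "unable to
link" instead of falling back to a copy.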

_______________________________________________
BackupPC-users mailing list
BackupPC-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/backuppc-users
http://backuppc.sourceforge.net/