Hi Robin,

   - disk fragmentation coupled with minimum allocation size overhead? i.e. 
lots of copies of the same large file, each updated in place between backups, 
might lead to this scenario, I think? (there's a quick check sketched just 
below this list)

   - can you confirm the trash is empty? is the trash cleanup daemon getting 
stuck, so that nothing is actually getting cleaned up?

      - is the nightly cleanup reporting that it actually removed anything? (I 
have "Nightly cleanup removed 31159 files of size 88.77GB (around 9/17 15:37)" 
or similar on my status page; there's a grep sketch below this list)

   - how many full and incremental backups are listed on your "host summary" 
page?
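
   A rough way to see how much is going to per-file allocation overhead (just 
a sketch - /backups/cpool is a guess at where your compressed pool lives, so 
adjust the path):

      # logical size vs. blocks actually allocated; a large gap points at
      # block-size / fragmentation overhead rather than extra files
      du -sh --apparent-size /backups/cpool
      du -sh /backups/cpool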
  
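   To check the trash and the nightly log directly (also a sketch - the log 
path varies by install, commonly $TopDir/log/LOG or /var/log/BackupPC/LOG):

      # anything still sitting in the trash?
      ls -A /backups/trash | head
      # did recent nightly runs actually remove anything?
      grep 'Nightly cleanup' /backups/log/LOG
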
   My pool is around 630GB and 3843035 files (so about 1/5th the size and half 
the number of files of yours), but df shows it using 654GB on disk. I put the 
discrepancy down to the number of directories in my pc tree - directories 
don't hardlink, only files do. The list of files in a directory has to be 
stored somewhere, and each directory has a minimum allocation size on disk.
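
   A rough sanity check of that theory (a sketch; it assumes a 4KB filesystem 
block size - check yours): my 24GB gap would need on the order of 6 million 
directories at 4KB apiece, which is plausible given that every backup 
recreates the full directory tree under pc/. Something like:

      # how many directories are in the pc tree?
      find /backups/pc -type d | wc -l
      # fundamental block size of the backup filesystem (GNU stat)
      stat -f -c '%S' /backups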
   
//chris

----- Original Message -----
From: "Robin Lee Powell" <rlpow...@digitalkingdom.org>
To: "BackupPC Users" <backuppc-users@lists.sourceforge.net>
Sent: Friday, 17 September, 2010 1:07:25 AM
Subject: [BackupPC-users] Surprising disk usage, *not* a link problem (I don't 
think)

I have a fairly large (171 hosts) backup environment that seems to
be using rather more disk than it should.

GUI says: Pool is 3055.59GB comprising 7361233 files and 4369
directories (as of 9/16 01:33),

df says:

Filesystem                 Size  Used Avail Use% Mounted on
/dev/mapper/local-backups  4.0T  3.8T  270G  94% /backups

which is a pretty significant gap. I've never moved anything
around, so it shouldn't be a linking problem.

The trash is empty.

Running "du -h --exclude='[0-9]*' --exclude=new" in the pc
directory, which *should* ignore only the directories that link to
the pool, gives "2.4G ."

Running a proper du of the whole tree would take A While, and I'm
not sure my RAM could survive the link checking along with all the
other stuff backuppc is doing.

Any ideas as to what's going on here?

Maybe relevant:

$Conf{BackupPCNightlyPeriod} = 8;

But that still seems excessive? I've run the nightly manually a few
times, as "sudo -u backuppc /usr/local/bin/BackupPC_nightly 0 255"
(is that the correct syntax for doing that?) and it's taken well
over 24 hours, so splitting it up seems like probably a good idea?
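
(Per the BackupPC docs, $Conf{BackupPCNightlyPeriod} = 8 means the daemon's 
own nightly run already covers only 1/8 of the 256 pool subdirectories each 
night, i.e. roughly the equivalent of:

    BackupPC_nightly 0 31
    BackupPC_nightly 32 63
    ...
    BackupPC_nightly 224 255

so a manual "0 255" pass is doing eight nights' worth of traversal in one go.)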

-Robin

--
http://singinst.org/ : Our last, best hope for a fantastic future.
Lojban (http://www.lojban.org/): The language in which "this parrot
is dead" is "ti poi spitaki cu morsi", but "this sentence is false"
is "na nei". My personal page: http://www.digitalkingdom.org/rlp/

