We are thinking about using Btrfs on standard hardware for a
fileserver with about 50 TB of storage (100 TB raw, 25 × 4 TB drives).
I would recommend carefully reading the thread titled "1 week to
rebuild 4x 3TB raid10 is a long time!"
So I have 2 × 2.6 TB devices in btrfs RAID-1, 716G used, running Linux 3.16.
One of the disks failed.
"btrfs device delete missing /home" is taking 9 days so far, on an idle
system:
root 4828 0.3 0.0 17844 260 pts/1 D+ Aug11 38:18 btrfs
device delete missing /home
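
For perspective: if roughly the full 716G has to be relocated and 9 days have
already elapsed, the effective relocation rate works out to well under
1 MiB/s. A back-of-envelope check (illustrative arithmetic only, using the
figures reported above):

# ~716 GiB over ~9 days, expressed in MiB/s
echo 'scale=2; 716 * 1024 / (9 * 24 * 3600)' | bc
# prints .94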
There is some btrfs debug info printed in dmesg which seems to indicate that
the operation is making progress, for example:
[744657.598810] BTRFS info (device sda4): relocating block group 908951814144 flags 17
[744672.021612] BTRFS info (device sda4): found 4784 extents
[744688.604997] BTRFS info (device sda4): found 4784 extents
[744689.133397] BTRFS info (device sda4): relocating block group 910025555968 flags 17
[744701.162678] BTRFS info (device sda4): found 4196 extents
[744725.000459] BTRFS info (device sda4): found 4196 extents
But other than that, the recovery time doesn't look promising to me, and
there is no way to check the progress.
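
There is no official progress indicator for "device delete" (at least not on
this kernel), but a rough sense of progress can be pieced together from what
the tools already expose. A minimal sketch, assuming the filesystem is mounted
at /home and that data block groups are the default size of up to 1 GiB:

# Block groups relocated so far, according to the kernel log above.
dmesg | grep -c 'relocating block group'

# Total allocation per profile; dividing the Data total by 1 GiB gives a
# rough idea of how many data block groups there are to move overall.
btrfs filesystem df /home

# Per-device sizes and used space (and any missing devices), for comparison.
btrfs filesystem show /home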
--
Tomasz Chmielewski
http://www.sslrack.com