Hi,

I'm running snv_134 on a 64-bit x86 motherboard with two SATA drives. The zpool 
"rpool" uses the whole disk of each drive. I've installed GRUB on both disks, 
and mirroring seems to be working great.

I just started testing what happens when a drive fails. I kicked off some 
activity and unplugged one of the drives while it was running; the system 
kept running, and zpool status indicated that one drive was removed. Awesome. 
I plugged it back in, and it recovered perfectly.
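For reference, this is roughly how I was checking the pool while the drive was unplugged and after reconnecting it (the device name c0t1d0 is just a placeholder, not necessarily my actual device):

```shell
# Check mirror health while one side is unplugged; zpool status
# shows the pool as DEGRADED with the missing device REMOVED.
zpool status rpool

# After plugging the drive back in, ZFS resyncs it automatically;
# a scrub is a cheap way to confirm both sides are consistent.
zpool scrub rpool
zpool status -v rpool
```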

But with one of the drives unplugged, the system hangs at boot. From either 
drive (with the other unplugged), GRUB loads and the system starts to boot. 
However, it gets stuck at the "Hostname: Vault" line and never reaches 
"reading ZFS config" like it would on a normal boot.

If I reconnect both drives, booting continues correctly.

If I detach a drive from the pool, the system also boots correctly off the 
single connected drive. However, reattaching the second drive causes a full 
resilver.
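Concretely, the detach/reattach sequence that triggers the full resilver looks something like this (c0t0d0s0 and c0t1d0s0 are placeholders for my two drives):

```shell
# Detach the second mirror side so the system will boot cleanly
# from the one remaining disk.
zpool detach rpool c0t1d0s0

# Later, reattach it as a mirror of the surviving disk; because the
# detach removed it from the pool config, ZFS resilvers the entire
# device rather than just the blocks that changed in the meantime.
zpool attach rpool c0t0d0s0 c0t1d0s0
```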

Is this a bug? Or is there something else you need to do to mark the drive as 
offline first? It's a shame if you have to do that before rebooting, since it 
would make recovery very hard if the drive were physically dead....
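If marking the drive offline before rebooting is the expected workaround, I assume it would be something like the following (again, c0t1d0s0 is a placeholder):

```shell
# Temporarily offline the missing side before rebooting; unlike
# detach, this keeps the device in the pool configuration.
zpool offline rpool c0t1d0s0

# Bring it back once the drive is reconnected; only the blocks
# written while it was offline should need resilvering.
zpool online rpool c0t1d0s0
```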

Thanks,
Matt
-- 
This message posted from opensolaris.org
_______________________________________________
zfs-discuss mailing list
zfs-discuss@opensolaris.org
http://mail.opensolaris.org/mailman/listinfo/zfs-discuss
