<snip>
> I am not 100% certain, but this is the only change
> that I implemented that made the difference. It might explain why
> when both drives were installed that the system booted, but when
> either was removed, it would not boot.
>
> Using LBA translation method in both the BIOS and with fdisk using
> matching LBA geometries, I can now boot from either drive - the way
> raid1 is meant to be . . .
>
<snip>
This is an issue with the firmware on the drive itself. Interestingly
enough, I have two supposedly identical 20 GB drives, purchased at the
same time, that report differently to the kernel at boot: when probed,
one reports CHS values and the other LBA. I haven't checked in detail,
but I suspect the firmware revision levels on the disks differ. The
disks will operate with either setting in the motherboard BIOS, and
will report those settings to fdisk. However, the kernel wants to use
the probe values, and otherwise will not consistently write the
correct data to the disk. So one is set up as CHS and the other as
LBA; both are in the same raid 1 set, and everything works fine with
those settings. It does not work with any other combination.
Bottom line: note the probe values reported by the kernel and use
those values when setting the BIOS on the motherboard, since fdisk
uses the BIOS values when creating partitions, while the kernel
appears to use the probe values read from the IDE interface. Maybe I
haven't got it quite right, but this is as close as I've managed
without digging into the code.
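For anyone wanting to check, a minimal sketch of pulling the geometry
out of a kernel IDE probe line so it can be compared against what the
BIOS and fdisk show. The probe line here is a made-up example of the
kind the kernel prints at boot (drive model and numbers are
illustrative, not from my system):

```shell
# Hypothetical boot-time probe line as the kernel might print it
# (in practice you would read this from dmesg output):
probe_line="hda: QUANTUM FIREBALL, ATA DISK drive, CHS=2484/255/63"

# Strip everything up to and including "CHS=" to get the geometry
# the kernel will actually use:
geom=${probe_line##*CHS=}
echo "kernel probe geometry: $geom"
```

On a live system something like `dmesg | grep '^hd'` should turn up
the real probe lines; the point is just that whatever cylinders/heads/
sectors the kernel reports there is what the BIOS and fdisk settings
ought to match.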
-
To unsubscribe from this list: send the line "unsubscribe linux-raid" in
the body of a message to [EMAIL PROTECTED]