Hello Somsak St.

        It is good to hear from you again. Here is what I have done over the
past few days:
        Major installed S/W versions and H/W information:
        S/W:
        (1). RHEL5 U4, updated to the latest rpms with "yum update":
                Original kernel: 2.6.18-164.el5 (x86_64)
                Up-to-date kernel: 2.6.18-194.17.1.el5 (x86_64)
        (2). HPDMmultipath 4.4.1 (Device Mapper Multipath Enablement Kit
for HP StorageWorks Disk Arrays)
        (3). PSP (ProLiant Support Pack for Linux) 8.5 for RHEL5 (x86_64)
        (4). qla2xxx.ko version 8.03.01.04.05.05-k (FC HBA kernel module)
        H/W:
        (1). Intel Xeon CPU (x2)
        (2). 64GB memory
        (3). QLogic Fibre Channel HBA, driver 8.03.01.04.05.05-k (x2)
                (QLogic HPAE312A - PCI-Express Dual Port 4Gb Fibre Channel HBA)
        (4). Network interfaces:
                Broadcom NetXtreme II Gigabit Ethernet, driver bnx2 v2.0.2
(x1, 4 ports per card)
                Intel PRO/1000, driver 1.1.17-NAPI (x2, 4 ports per card)
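For reference, the commands below can reproduce the version inventory above on
each node (a sketch; the modinfo lines print nothing if a driver is not
installed on the machine where they run):

```shell
# Collect the kernel, multipath, and driver versions listed above
uname -r                                             # running kernel version
rpm -q device-mapper-multipath 2>/dev/null || true   # DM-multipath package
modinfo -F version qla2xxx 2>/dev/null || true       # QLogic FC HBA driver
modinfo -F version bnx2 2>/dev/null || true          # Broadcom NIC driver
```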

1.      10/01:
a.      Installed RHEL5 U4 on the two ProLiant DL380 G6 servers.
b.      Configured bond0/bond1 for the network interfaces; the bonded
interfaces work.
c.      Enabled and started multipathd, then used "multipath -l" to check the
shared disks on the EVA8400; the shared disks are visible.
d.      Used "mkqdisk -c /dev/mapper/mpath3 -l QDISK" to create the quorum
disk and "mkqdisk -L" to view it on both ProLiant servers; the output is OK.
e.      Configured and tested iLO fencing; it works on both ProLiant servers.
f.      Configured a basic two-node cluster with the quorum disk. When
starting the cluster with "service cman start" on both servers, only one node
ever gains quorum.
2.      10/02:
a.      Configured yum and updated the installed rpms to the latest versions,
then booted the new kernel for further tests; still only one node can gain
quorum.
b.      Jason changed the host OS type for the shared disks on the EVA8400
from Windows to Linux.
c.      Added some qla2xxx options to /etc/modprobe.conf for testing; it still
fails to join the two nodes into one working cluster.
3.      10/03:
a.      Installed the "Device Mapper Multipath Enablement Kit for HP
StorageWorks Disk Arrays" for more tests; it still fails.
b.      Pulled out a fibre-channel cable to leave only one path to the EVA8400
for more tests; it fails, too.
c.      Installed the "ProLiant Support Pack" for more tests; it fails, too.
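In case it helps the discussion, below is a minimal cluster.conf sketch for a
two-node cluster with a quorum disk (the cluster name, node names, fence-device
hostnames, and credentials are assumptions; the qdisk label matches the mkqdisk
command in step 1d). With a qdisk, two_node stays 0 and expected_votes is 3
(one vote per node plus one for the qdisk):

```xml
<?xml version="1.0"?>
<cluster name="evacluster" config_version="1">
  <!-- two_node="0": the quorum disk supplies the tie-breaking vote -->
  <cman two_node="0" expected_votes="3"/>
  <!-- label must match "mkqdisk -c /dev/mapper/mpath3 -l QDISK" -->
  <quorumd interval="1" tko="10" votes="1" label="QDISK"/>
  <clusternodes>
    <clusternode name="node1" nodeid="1" votes="1">
      <fence><method name="1"><device name="ilo-node1"/></method></fence>
    </clusternode>
    <clusternode name="node2" nodeid="2" votes="1">
      <fence><method name="1"><device name="ilo-node2"/></method></fence>
    </clusternode>
  </clusternodes>
  <fencedevices>
    <fencedevice name="ilo-node1" agent="fence_ilo" hostname="ilo-node1"
                 login="admin" passwd="password"/>
    <fencedevice name="ilo-node2" agent="fence_ilo" hostname="ilo-node2"
                 login="admin" passwd="password"/>
  </fencedevices>
</cluster>
```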

I will send you the following files and command outputs as soon as I can:
the output of "mkqdisk -L", "multipath -ll", and "clustat -v", plus
/etc/cluster/cluster.conf and /etc/multipath.conf.
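To gather those files and outputs, something like the script below could be run
on each node (a sketch; paths are taken from the list above, and failing
commands are tolerated so a partial archive is still produced):

```shell
# Collect the requested outputs into one tarball per node
OUT=/tmp/cluster-info-$(hostname)
mkdir -p "$OUT"
mkqdisk -L    > "$OUT/mkqdisk-L.txt"     2>&1 || true
multipath -ll > "$OUT/multipath-ll.txt"  2>&1 || true
clustat -v    > "$OUT/clustat-v.txt"     2>&1 || true
cp /etc/cluster/cluster.conf /etc/multipath.conf "$OUT"/ 2>/dev/null || true
tar czf "$OUT.tar.gz" -C /tmp "$(basename "$OUT")"
```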
Thanks a lot; your kind support is greatly appreciated.

Best Regards
Danny Lin


--
Linux-cluster mailing list
[email protected]
https://www.redhat.com/mailman/listinfo/linux-cluster
