We have a computational cluster consisting of 8 HP ProLiant
ML370 G5 servers with 32 GB of RAM each.
Each node has a Mellanox single-port InfiniBand DDR HCA (20 Gbit/s),
and the nodes are connected to each other through
a Voltaire ISR9024D-M DDR InfiniBand switch.

Now we want to increase the bandwidth to 40 Gbit/s by adding a second
InfiniBand card to each node.

I want to ask whether this is possible, and if so, how.
You need to check whether it is possible to add one more InfiniBand card to
your motherboard. You also need to verify that your PCI Express links and the chipset will allow you to utilize the resources of two HCAs. You could temporarily take two HCAs from some of your machines and add them to another pair of machines; that would let you do some benchmarking with two HCAs before buying more cards.
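A quick way to sanity-check the PCI Express side is to look at the negotiated link width and speed of the existing HCA. The commands below are a hardware-dependent sketch; the PCI bus address (here `0e:00.0`) is a placeholder you must replace with the address `lspci` actually reports on your system.

```shell
# List InfiniBand HCAs visible to the OS:
lspci | grep -i mellanox

# Inspect the PCIe capabilities vs. the negotiated link of one HCA
# (LnkCap = what the card supports, LnkSta = what was negotiated;
#  substitute the bus address reported by the command above):
sudo lspci -vv -s 0e:00.0 | grep -E 'LnkCap|LnkSta'

# Confirm each HCA port is Active and check its rate from the IB side:
ibstat
```

If `LnkSta` shows fewer lanes than `LnkCap` (e.g. x4 negotiated on an x8 card), the slot or chipset is already limiting a single HCA, and adding a second one will not give you the full 40 Gbit/s.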

From the driver and Open MPI perspective, a configuration with two (or more) HCAs is supported by default.
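For Open MPI's openib BTL, active HCAs are used automatically, but you can also name them explicitly with MCA parameters. A sketch of such a launch follows; the device names (`mthca0`, `mlx4_0`) depend on your HCA generation, and `./my_mpi_app` is a placeholder application:

```shell
# Explicitly select both HCAs for the openib BTL
# (older InfiniHost III cards appear as mthca0/mthca1,
#  ConnectX cards as mlx4_0/mlx4_1 -- check with ibstat):
mpirun --mca btl openib,self \
       --mca btl_openib_if_include mthca0,mthca1 \
       -np 16 ./my_mpi_app

# List the openib BTL parameters your installation supports:
ompi_info --param btl openib
```

Running a bandwidth benchmark (e.g. an MPI point-to-point test) with and without the `btl_openib_if_include` restriction is a simple way to confirm that traffic is actually being striped across both cards.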

Pasha.
