Hi,

We have a computational cluster consisting of 8 HP ProLiant ML370 G5
servers with 32 GB RAM each.
Each node has a Mellanox single-port InfiniBand DDR HCA (20 Gbit/s)
and the nodes are connected to each other through
a Voltaire ISR9024D-M DDR InfiniBand switch.

Now we want to increase the bandwidth to 40 Gbit/s by adding a second
InfiniBand card to each node.

I want to ask whether this is possible, and if so, how?

Do I have to set up an InfiniBand bonding configuration, or is Open MPI
already able to use the second card and double the bandwidth?
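
For what it's worth, here is a rough sketch of what I imagine the Open MPI
side might look like, assuming the openib BTL and InfiniHost-style device
names (mthca0, mthca1); the device names, the btl_openib_if_include
parameter, and the application name are just assumptions on my part and may
not match our installation:

  # Use the verbs (openib) BTL plus shared memory and self,
  # and list both HCAs explicitly so traffic can go over both.
  mpirun --mca btl openib,self,sm \
         --mca btl_openib_if_include mthca0,mthca1 \
         -np 8 ./my_mpi_app

Would something along these lines be enough, or would bonding at the OS
level still be required?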

Has anyone deployed such a configuration?


Thanks.


Sefa Arslan
