Re: [OMPI users] openmpi 1.3.3 with OpenFOAM

2009-09-02 Thread Jeff Squyres
I'll let the Myricom guys answer further about message-passing optimizations, but some general tips:

- Yes, using processor affinity might help. Add "--mca mpi_paffinity_alone 1" to your mpirun command line and see if that helps.
- I *believe* that HP MPI uses processor affinity by default
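For concreteness, a minimal sketch of the suggested mpirun invocation. The solver name and process count are hypothetical placeholders; only the `--mca mpi_paffinity_alone 1` flag comes from the advice above (in Open MPI 1.3.x it pins each MPI process to a core, reducing OS-induced process migration):

```shell
# Hypothetical invocation: enable processor affinity in Open MPI 1.3.x.
# "-np 8" and "simpleFoam -parallel" are illustrative, not from the thread.
mpirun --mca mpi_paffinity_alone 1 -np 8 simpleFoam -parallel
```

You can check which affinity-related MCA parameters your build supports with `ompi_info --param mpi all | grep paffinity`.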

[OMPI users] openmpi 1.3.3 with OpenFOAM

2009-08-27 Thread bastil2...@yahoo.de
Dear openmpi-ers, I recently installed Open MPI 1.3.3 to run OpenFOAM 1.5 on our Myrinet cluster. I saw great performance improvements compared to Open MPI 1.2.6, but it is still a little behind the commercial HP MPI. Are there further tips for fine-tuning the parameters to run mpirun with for this
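Since the question concerns tuning Open MPI on a Myrinet cluster, one avenue worth noting: on Myrinet/MX hardware, Open MPI 1.3.x can use either the `mx` BTL or the MX matching transport layer (MTL), and which one is selected can affect latency. This is a hedged sketch of how one might compare the two; the process count and solver are illustrative assumptions, not from the thread:

```shell
# Sketch: comparing Open MPI transport selections on a Myrinet/MX cluster.
# Option 1: use the MX BTL alongside shared memory and self transports.
mpirun --mca btl mx,sm,self -np 8 simpleFoam -parallel

# Option 2: use the MX MTL via the "cm" PML (often lower latency on MX).
mpirun --mca pml cm --mca mtl mx -np 8 simpleFoam -parallel
```

`ompi_info | grep mx` shows whether your build includes MX support at all; if neither component is listed, Open MPI may be falling back to TCP, which would explain a gap versus HP MPI.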