Thanks; this workaround does allow it to complete its run.
On Tue, 25 Oct 2005 10:19:54 -0600, Galen M. Shipman wrote:
Correction: HPL_NO_DATATYPE should be: HPL_NO_MPI_DATATYPE.
- Galen
On Oct 25, 2005, at 10:13 AM, Galen M. Shipman wrote:
Hi Troy,
Sorry for the delay; I am now able to reproduce this behavior when I
do not specify HPL_NO_DATATYPE. If I do specify HPL_NO_DATATYPE, the
run completes. We will be looking into this now.
Thanks,
Galen
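
For reference, HPL_NO_MPI_DATATYPE is a compile-time option of HPL
itself; defining it makes HPL avoid MPI user-defined datatypes. A
minimal sketch of how the define is usually passed, assuming the stock
Make.<arch> template (the architecture name is a placeholder and the
rebuild commands are from memory):

    # in hpl/Make.<arch>
    HPL_OPTS = -DHPL_NO_MPI_DATATYPE
    # rebuild from clean so the flag takes effect, e.g.:
    #   make arch=<arch> clean_arch_all
    #   make arch=<arch>
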
On Oct 21, 2005, at 5:03 PM, Troy Telford wrote:
I've been trying out the RC4 builds of OpenMPI; I've been using Myrinet
(gm), Infiniband (mvapi), and TCP.
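
(For anyone trying to reproduce this: with the 1.0 release candidates a
specific interconnect can be forced via MCA parameters. A rough sketch,
assuming an xhpl binary in the current directory; process count and
hostfile are placeholders:

    mpirun -np 4 --mca btl gm,sm,self ./xhpl
    mpirun -np 4 --mca btl mvapi,sm,self ./xhpl
    mpirun -np 4 --mca btl tcp,sm,self ./xhpl
)
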
When running a benchmark such as IMB (formerly PALLAS, IIRC), or even a
simple hello world, there are no problems.
However, when running HPL (and HPCC, which is a superset of HPL), I h