Hi,
It looks like Intel's mpirun doesn't have a '-machinefile' option. Instead,
it has a '-hostfile' option (from here:
http://downloadmirror.intel.com/18462/eng/nes_release_notes.txt).
Try 'mpirun -h' for information about the available options and apply the appropriate one.
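If it helps, a quick way to check which of the two options your mpirun supports is to filter the help output (a rough sketch; the exact help text differs between MPI implementations):

   mpirun -h 2>&1 | grep -i -e machinefile -e hostfile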
Best regards,
Maxim Rakitin
Hi Maxim,
Thanks for your reply!
We tried MPIRUN=mpirun -np _NP_ -hostfile _HOSTS_ _EXEC_, but the problem
persists. The only difference is that stdout changes to "MPI: invalid option
-hostfile".
Thanks,
Wei
On Oct 31, 2010, at 10:40 PM, Maxim Rakitin wrote:
[...]
Dear Wei,
Maybe -machinefile is OK for your mpirun. Which options are appropriate
for it? What does the help say?
Try to restore your MPIRUN variable with -machinefile and rerun the
calculation. Then see what is in the .machine0/1/2 files and let us know. They
should contain 8 lines of the r1i0n0 node [...]
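For reference, a .machine0 for an 8-process run on that node would just repeat the node name, one line per MPI process (a sketch assuming the node name above):

   r1i0n0
   r1i0n0
   r1i0n0
   r1i0n0
   r1i0n0
   r1i0n0
   r1i0n0
   r1i0n0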
01 Nov 2010 02:56:47 Wei Xie wrote:
[...]
This is a TiC example running [...]
Dear Wei,
Isn't the error connected with spin-polarised [...]
Hi Maxim,
Thanks for the follow-up!
I think it should be -machinefile that's appropriate. Here's the relevant line from the help:
-machinefile # file mapping procs to machine
No -hostfile option is mentioned in the help for my current version of MPI.
Yes, the .machine0/1/2 files are exactly like what you described [...]
Hi Wei,
The parallel_options file manages how parallel programs run, so change
the following line in it:
setenv WIEN_MPIRUN mpirun -np _NP_ -hostfile _HOSTS_ _EXEC_
to
setenv WIEN_MPIRUN mpirun -np _NP_ -machinefile _HOSTS_ _EXEC_
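After saving the file you can quickly confirm the change took effect (a simple sanity check; run it from the directory containing parallel_options):

   grep WIEN_MPIRUN parallel_options

It should print the corrected line with -machinefile.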
Your .machine0/1/2 files are correct, [...]
Also I believe [...]
Dear Lyudmila,
On Nov 1, 2010, at 8:36 AM, Lyudmila V. Dobysheva wrote:
[...]
Dear all WIEN2k community members:
We encountered some problems when running in parallel (K-point, MPI or
both)--the calculations crashed at LAPW2. Note we had no problem running it in
serial. We have tried to diagnose the problem, recompile the code with
different options and test with [...]