Shi,
Justin straightened me out regarding the command structure; I used "mpirun -np 
8 gmx_mpi mdrun -deffnm Run_file.nvt".

But for the time being I've given up on two GPUs with the 32-core system.  I am 
now just trying to make the single GPU work well.
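
For now I am testing something along these lines on the one GPU (the rank/thread 
split is only a first guess for the 2990WX, and -gpu_id 0 assumes the card I want 
is device 0):

  mpirun -np 4 gmx_mpi mdrun -deffnm Run_file.nvt -nb gpu -gpu_id 0 -ntomp 8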

Paul

> On Dec 19, 2018, at 5:51 AM, Shi Li <sli...@g.uky.edu> wrote:
> 
>> 
>> 
>> Date: Tue, 18 Dec 2018 21:51:41 -0600
>> From: paul buscemi <pbusc...@q.com>
>> To: "gmx-us...@gromacs.org" <gmx-us...@gromacs.org>
>> Subject: Re: [gmx-users] error on opening gmx_mpi
>> 
>> Shi, thanks for the note.
>> 
>> Yes, somehow there is a version of GROMACS 5 that is being summoned. I've got 
>> to clean up my act a bit.
>> 
>> A suggestion was made to try the MPI version because of the CPU I am using. 
>> GROMACS 2018.3 was installed, but I removed its build and built the 2019 beta 
>> MPI version in a separate directory. Apparently some remnants are still being 
>> called. But version 5 has never been installed on this particular computer, so 
>> I have no idea where GROMACS 5.1.2 is coming from. It may be easier to purge 
>> everything and start again.
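>> 
>> If I do purge, I'd probably start with something like this to track down and 
>> remove the stray 5.1.2 (untested; the package names are only a guess for a 
>> Debian/Ubuntu install, which the /usr/bin/mdrun_mpi.openmpi path suggests):
>> 
>>   which -a gmx_mpi mdrun_mpi           # list every copy on the PATH
>>   dpkg -S /usr/bin/mdrun_mpi.openmpi   # which package owns the old binary
>>   sudo apt-get remove gromacs gromacs-openmpi gromacs-data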
>> 
>> Paul
> 
> Another way to solve this is to install the new version of GROMACS under its 
> own prefix directory instead of the default. Then make a small file that 
> sources the new GMXRC from that prefix and loads the modules you used to build 
> the program, so you won't mix up the different versions on your 
> computer/cluster. 
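> 
> For example (the prefix and file name below are just placeholders):
> 
>   cmake .. -DCMAKE_INSTALL_PREFIX=$HOME/gromacs-2019 [your other -D flags]
>   make -j 8 && make install
> 
> and then a one-line file, say ~/gmx2019.rc, containing
> 
>   source $HOME/gromacs-2019/bin/GMXRC
> 
> plus any "module load" lines you need. Source that file before running, so the 
> new gmx_mpi is the one found first on PATH. 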
> 
> Shi 
>> 
>>> On Dec 18, 2018, at 8:48 PM, Shi Li <sli...@g.uky.edu> wrote:
>>> 
>>>> 
>>>> Date: Tue, 18 Dec 2018 15:12:00 -0600
>>>> From: p buscemi <pbusc...@q.com>
>>>> To: "=?utf-8?Q?gmx-users=40gromacs.org?=" <gmx-us...@gromacs.org>
>>>> Subject: [gmx-users] error on opening gmx_mpi
>>>> 
>>>> I installed the 2019 beta gmx_mpi with:
>>>> cmake .. -DGMX_BUILD_OWN_FFTW=ON -DREGRESSIONTEST_DOWNLOAD=ON -DGMX_GPU=on 
>>>> -DCMAKE_CXX_COMPILER=/usr/bin/g++-7 -DCMAKE_C_COMPILER=/usr/bin/gcc-7 
>>>> -DGMX_MPI=ON -DGMX_USE_OPENCL=ON
>>>> 
>>>> The install completed with no errors.
>>>> I need to take this step by step, starting with minimization. For the minimization I used
>>>> mpirun -np 8 mdrun_mpi -deffnm RUNname.em
>>>> with the output:
>>>> :-) GROMACS - mdrun_mpi, VERSION 5.1.2 (-:
>>>> etc etc....
>>>> GROMACS: mdrun_mpi, VERSION 5.1.2
>>>> Executable: /usr/bin/mdrun_mpi.openmpi
>>>> Data prefix: /usr
>>> 
>>> It looks like you didn't run the newly installed GROMACS. What is the output 
>>> when you run gmx_mpi? It should be the new version (2019 beta), not 5.1.2. 
>>> Have you put the new GROMACS in your PATH or sourced its GMXRC?
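>>> 
>>> For example (assuming the default install prefix of /usr/local/gromacs; adjust 
>>> the path if you installed somewhere else):
>>> 
>>>   source /usr/local/gromacs/bin/GMXRC
>>>   which gmx_mpi         # should point at the new install, not /usr/bin
>>>   gmx_mpi --version     # should report the 2019 beta, not 5.1.2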
>>> 
>>> Shi
>>> 
>>> 
>>>> Command line:
>>>> mdrun_mpi -deffnm PVP20k1.em
>>>> 
>>>> Back Off! I just backed up PVP20k1.em.log to ./#PVP20k1.em.log.2#
>>>> Running on 1 node with total 64 cores, 64 logical cores
>>>> Hardware detected on host rgb2 (the node of MPI rank 0):
>>>> CPU info:
>>>> Vendor: AuthenticAMD
>>>> Brand: AMD Ryzen Threadripper 2990WX 32-Core Processor
>>>> SIMD instructions most likely to fit this hardware: AVX_128_FMA
>>>> SIMD instructions selected at GROMACS compile time: SSE2
>>>> 
>>>> Compiled SIMD instructions: SSE2, GROMACS could use AVX_128_FMA on this 
>>>> machine, which is better
>>>> Reading file PVP20k1.em.tpr, VERSION 2018.4 (single precision)
>>>> -------------------------------------------------------
>>>> Program mdrun_mpi, VERSION 5.1.2
>>>> Source code file: 
>>>> /build/gromacs-z6bPBg/gromacs-5.1.2/src/gromacs/fileio/tpxio.c, line: 3345
>>>> 
>>>> Fatal error:
>>>> reading tpx file (PVP20k1.em.tpr) version 112 with version 103 program
>>>> For more information and tips for troubleshooting, please check the GROMACS
>>>> website at http://www.gromacs.org/Documentation/Errors
>>>> -------------------------------------------------------
>>>> 
>>>> Halting parallel program mdrun_mpi on rank 0 out of 8
>>>> --------------------------------------------------------------------------
>>>> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
>>>> with errorcode 1.
>>>> 
>>>> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
>>>> You may or may not see output from other processes, depending on
>>>> exactly when Open MPI kills them.
>>>> ====
>>>> I see the fatal error, but minim.mdp was used while in gmx_mpi - this is not 
>>>> covered in the common errors. And I see the note on AVX_128_FMA, but that can 
>>>> wait. Is it the file version mismatch (112 vs. 103) that is at fault?
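>>>> 
>>>> (When I do get to the SIMD note, I gather it is just a matter of rebuilding 
>>>> with the SIMD level set explicitly, something like
>>>> 
>>>>   cmake .. -DGMX_SIMD=AVX_128_FMA [other flags as before]
>>>> 
>>>> or letting cmake auto-detect it on this machine.)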
>>>> 
>>>> I need to create the proper tpr to continue
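>>>> 
>>>> If the problem is that the wrong build is being picked up, then once that is 
>>>> fixed I assume the 2018.4 tpr will be read directly, or I can regenerate it 
>>>> with the new grompp, e.g. (minim.mdp is my file; the .gro and .top names 
>>>> below are only placeholders):
>>>> 
>>>>   gmx_mpi grompp -f minim.mdp -c PVP20k1.gro -p topol.top -o PVP20k1.em.tpr
>>>>   mpirun -np 8 gmx_mpi mdrun -deffnm PVP20k1.em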
>>>> 
>>>> 
>>> 

-- 
Gromacs Users mailing list

* Please search the archive at 
http://www.gromacs.org/Support/Mailing_Lists/GMX-Users_List before posting!

* Can't post? Read http://www.gromacs.org/Support/Mailing_Lists

* For (un)subscribe requests visit
https://maillist.sys.kth.se/mailman/listinfo/gromacs.org_gmx-users or send a 
mail to gmx-users-requ...@gromacs.org.
