Re: [OMPI devel] Open MPI 1.8.6 memory leak

2015-07-10 Thread Nick Papior
Just want to confirm.

I also see this memory leak on 1.8.6; using 1.8.7rc1 fixes it.



Re: [OMPI devel] Open MPI 1.8.6 memory leak

2015-07-01 Thread Gilles Gouaillardet

Nathan,

The root cause is that your fixes were not backported to the v1.8 (or the
v1.10) branch.

I made PR https://github.com/open-mpi/ompi-release/pull/357 to fix this.

Could you please review it?

Since there are quite a lot of differences between v1.8 and master, the
backport was not trivial. I left some #if 0 blocks in the code since I do
not know whether something needs to be done about the RDMA fragments.


Cheers,

Gilles


Re: [OMPI devel] Open MPI 1.8.6 memory leak

2015-07-01 Thread Nathan Hjelm

I don't see the leak on master on OS X using the leaks command. I will see
what valgrind finds on Linux.

-Nathan


[OMPI devel] Open MPI 1.8.6 memory leak

2015-07-01 Thread Rolf vandeVaart
There have been two reports on the user list about memory leaks.  I have
reproduced this leak with LAMMPS.  Note that this has nothing to do with
CUDA-aware features.  The steps that Stefan has provided make it easy to
reproduce.

Here are some more specific steps to reproduce, derived from Stefan's report.

1. Clone LAMMPS: git clone git://git.lammps.org/lammps-ro.git lammps
2. cd into src/ and compile with Open MPI 1.8.6: set your path to Open MPI
and type "make mpi".
3. Run the example in lammps/examples/melt. To do this, first copy
"lmp_mpi" from the src directory into the melt directory, then increase the
step count on the "run 25" line of in.melt so that it runs for a while.
4. Run it with: mpirun -np 2 lmp_mpi < in.melt
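The in.melt edit in step 3 can also be scripted; a minimal sketch, where the
helper name and the target step count (250000) are arbitrary choices, not
anything from the thread:

```python
import re

def bump_run_steps(text, steps):
    """Replace the step count on a LAMMPS 'run N' line with `steps`."""
    # LAMMPS input decks use lines like: run 25
    return re.sub(r"^(run\s+)\d+", lambda m: m.group(1) + str(steps),
                  text, flags=re.MULTILINE)

# Example (250000 is an arbitrary "long enough" value):
# with open("in.melt") as f:
#     deck = f.read()
# with open("in.melt", "w") as f:
#     f.write(bump_run_steps(deck, 250000))
```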

For reference, here is the memory consumption for both 1.8.5 and 1.8.6.
1.8.5 stays very stable, whereas 1.8.6 almost triples after 6 minutes of
running.
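The ps samples below can be collected with a small poller instead of running
ps by hand; a minimal sketch, assuming Linux (/proc), with the 30-second
interval mirroring the sampling here:

```python
import os
import time

def rss_kb(pid):
    """Resident set size of a process in kB, read from /proc (Linux only)."""
    with open("/proc/%d/status" % pid) as f:
        for line in f:
            if line.startswith("VmRSS:"):
                return int(line.split()[1])  # VmRSS is reported in kB
    return 0

def watch(pids, interval=30.0, samples=12):
    """Print one RSS sample per PID every `interval` seconds."""
    for _ in range(samples):
        print("  ".join("pid %d: %d kB" % (p, rss_kb(p)) for p in pids))
        time.sleep(interval)

# Example: watch([26907, 26908]) for the two MPI ranks shown below.
```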

Open MPI 1.8.5

USER   PID %CPU %MEM    VSZ   RSS TTY      STAT START   TIME COMMAND
3234126907 59.0  0.0 329672 14584 pts/16   Rl   16:24   0:00 ./lmp_mpi_185_nocuda
3234126908 60.0  0.0 329672 14676 pts/16   Rl   16:24   0:00 ./lmp_mpi_185_nocuda
3234126907 98.3  0.0 329672 14932 pts/16   Rl   16:24   0:30 ./lmp_mpi_185_nocuda
3234126908 98.5  0.0 329672 14932 pts/16   Rl   16:24   0:30 ./lmp_mpi_185_nocuda
3234126907 98.9  0.0 329672 14960 pts/16   Rl   16:24   1:00 ./lmp_mpi_185_nocuda
3234126908 99.1  0.0 329672 14952 pts/16   Rl   16:24   1:00 ./lmp_mpi_185_nocuda
3234126907 99.1  0.0 329672 14960 pts/16   Rl   16:24   1:30 ./lmp_mpi_185_nocuda
3234126908 99.3  0.0 329672 14952 pts/16   Rl   16:24   1:30 ./lmp_mpi_185_nocuda
3234126907 99.2  0.0 329672 14960 pts/16   Rl   16:24   2:00 ./lmp_mpi_185_nocuda
3234126908 99.4  0.0 329672 14952 pts/16   Rl   16:24   2:00 ./lmp_mpi_185_nocuda
3234126907 99.3  0.0 329672 14960 pts/16   Rl   16:24   2:30 ./lmp_mpi_185_nocuda
3234126908 99.5  0.0 329672 14952 pts/16   Rl   16:24   2:30 ./lmp_mpi_185_nocuda
3234126907 99.4  0.0 329672 14960 pts/16   Rl   16:24   2:59 ./lmp_mpi_185_nocuda
3234126908 99.5  0.0 329672 14952 pts/16   Rl   16:24   3:00 ./lmp_mpi_185_nocuda
3234126907 99.4  0.0 329672 14960 pts/16   Rl   16:24   3:29 ./lmp_mpi_185_nocuda
3234126908 99.6  0.0 329672 14956 pts/16   Rl   16:24   3:30 ./lmp_mpi_185_nocuda
3234126907 99.4  0.0 329672 14960 pts/16   Rl   16:24   3:59 ./lmp_mpi_185_nocuda
3234126908 99.6  0.0 329672 14956 pts/16   Rl   16:24   4:00 ./lmp_mpi_185_nocuda
3234126907 99.4  0.0 329672 14960 pts/16   Rl   16:24   4:29 ./lmp_mpi_185_nocuda
3234126908 99.6  0.0 329672 14956 pts/16   Rl   16:24   4:30 ./lmp_mpi_185_nocuda
3234126907 99.5  0.0 329672 14960 pts/16   Rl   16:24   4:59 ./lmp_mpi_185_nocuda
3234126908 99.6  0.0 329672 14956 pts/16   Rl   16:24   5:00 ./lmp_mpi_185_nocuda
3234126907 99.5  0.0 329672 14960 pts/16   Rl   16:24   5:29 ./lmp_mpi_185_nocuda
3234126908 99.6  0.0 329672 14956 pts/16   Rl   16:24   5:29 ./lmp_mpi_185_nocuda
3234126907 99.5  0.0 329672 14960 pts/16   Rl   16:24   5:59 ./lmp_mpi_185_nocuda
3234126908 99.6  0.0 329672 14956 pts/16   Rl   16:24   5:59 ./lmp_mpi_185_nocuda

Open MPI 1.8.6

USER   PID %CPU %MEM    VSZ    RSS TTY      STAT START   TIME COMMAND
3234126755  0.0  0.0 330288  15368 pts/16   Rl   16:10   0:00 ./lmp_mpi_186_nocuda
3234126756  0.0  0.0 330284  15376 pts/16   Rl   16:10   0:00 ./lmp_mpi_186_nocuda
3234126755  100  0.0 409856  94976 pts/16   Rl   16:10   0:30 ./lmp_mpi_186_nocuda
3234126756  100  0.0 409848  94904 pts/16   Rl   16:10   0:30 ./lmp_mpi_186_nocuda
3234126755  100  0.1 489292 174320 pts/16   Rl   16:10   1:00 ./lmp_mpi_186_nocuda
3234126756  100  0.1 489288 174536 pts/16   Rl   16:10   1:00