From: <[EMAIL PROTECTED]>
Reply-To: Discussion list for GROMACS users
To: Discussion list for GROMACS users
Subject: Re: [gmx-users] GROMACS Parallel Runs
Date: Fri, 06 Oct 2006 10:54:12 +0200
Sunny wrote:
From: "Dallas B. Warren" <[EMAIL PROTECTED]>
Reply-To: Discussion list for GROMACS users
To: Discussion list for GROMACS users
Subject: RE: [gmx-users] GROMACS Parallel Runs
Date: Fri, 06 Oct 2006 09:35:31 +1000
> I have successfully run gmx on up to 128 cpus. When I scale
> to 256 cpus, the
> following error occurs. Does it mean that gmx can't be run on
> 256 nodes?
>
> Fatal error:
> could not find a grid spacing with nx and ny divisible by the
> number of
> nodes (256)
Isn't that just due to the
From: "Sunny" <[EMAIL PROTECTED]>
Reply-To: Discussion list for GROMACS users
To: gmx-users@gromacs.org
Subject: Re: [gmx-users] GROMACS Parallel Runs
Date: Tue, 03 Oct 2006 09:57:43 +
Hi Carsten,
Setting fourier_nx to a larger number does work.
Thanks.
Sunny
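Sunny's fix can be sketched numerically. The sketch below is a hypothetical helper (the function name is mine, not part of GROMACS): it rounds a PME grid dimension up to the next multiple of the node count, which is what the "nx and ny divisible by the number of nodes (256)" error is asking for.

```python
# Hypothetical helper (illustrative only, not GROMACS code): round a PME
# grid dimension up to the next multiple of the node count, so the grid
# can be sliced evenly across the processors.
def next_divisible_grid_size(desired_nx: int, n_nodes: int) -> int:
    """Smallest value >= desired_nx that is divisible by n_nodes."""
    remainder = desired_nx % n_nodes
    if remainder == 0:
        return desired_nx
    return desired_nx + (n_nodes - remainder)

# A 64-point grid on 256 nodes must grow to 256 points:
print(next_divisible_grid_size(64, 256))   # -> 256
# A 300-point grid on 256 nodes rounds up to 512:
print(next_divisible_grid_size(300, 256))  # -> 512
```

A larger grid costs extra FFT work, so in practice one would pick the smallest admissible value rather than an arbitrary big one.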
From: Carsten Kutzner <[EMAIL PROTECTED]>
Reply-To: Discussion list for GROMACS users
To: Discussion list for GROMACS users
Subject: Re: [gmx-users] GROMACS Parallel Runs
Date: Mon, 02 Oct 2006 11:06:49 +0200
Sunny wrote:
Hi David,
From: David van der Spoel <[EMAIL PROTECTED]>
Reply-To: Discussion list for GROMACS users
To: Discussion list for GROMACS users
Subject: Re: [gmx-users] GROMACS Parallel Runs
Date: Mon, 02 Oct 2006 16:04:45 +0200
Sunny wrote:
Hi all,
Thanks for your proposed solutions.
From: Carsten Kutzner <[EMAIL PROTECTED]>
Reply-To: Discussion list for GROMACS users
To: Discussion list for GROMACS users
Subject: Re: [gmx-users] GROMACS Parallel Runs
Date: Mon, 02 Oct 2006 11:06:49 +0200
Hi,
the current version of gmx requires at least pme_order/2 grid points per
processor for the x-dimension of the pme grid. With pme_order=4 and
fourier_nx=64 you end up with only one grid point per processor.
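The constraint Carsten describes can be sketched as a small check (the function name and signature are mine, for illustration; this is not a GROMACS API):

```python
# Hypothetical check of the constraint described above: the x-dimension of
# the PME grid must give each processor at least pme_order/2 grid points.
def pme_grid_ok(fourier_nx: int, n_procs: int, pme_order: int = 4) -> bool:
    points_per_proc = fourier_nx // n_procs
    return points_per_proc >= pme_order // 2

# fourier_nx=64 on 64 CPUs leaves 1 point per CPU, below the minimum of 2:
print(pme_grid_ok(64, 64))    # -> False
# Doubling the grid to 128 points satisfies the constraint:
print(pme_grid_ok(128, 64))   # -> True
```

This also matches Sunny's observation that raising fourier_nx made the 64-CPU run work.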
in short time. I will contact the system
support to see if they would do the update.
Thanks,
Sunny
From: David van der Spoel <[EMAIL PROTECTED]>
Reply-To: Discussion list for GROMACS users
To: Discussion list for GROMACS users
Subject: Re: [gmx-users] GROMACS Parallel Runs
Date: Sun, 01 Oct 2006 19:58:48 +0200
Sunny wrote:
Hi,
I am using GROMACS 3.3.1 for parallel runs on an AIX supercomputing
system. My simulation runs successfully on 16 and 32 CPUs (as well as
below 16 CPUs). When running on 64 CPUs, however, a segmentation fault
occurs in multiple tasks from the very beginning of the simulation. I'd
like to know what