On Thu, Jun 2, 2016 at 5:27 PM, Luc Berger-Vergiat wrote:
> Ok, I get it. Then if I have multiple subdomains on the local processor, is
> and is_local will be arrays of ISs that represent each subdomain on that
> processor?
>
Yep.
Matt
> Best,
> Luc
>
> On 06/02/2016 06:21 PM, Matthew Knepley wrote:
Ok, I get it. Then if I have multiple subdomains on the local processor,
is and is_local will be arrays of ISs that represent each subdomain on
that processor?
Best,
Luc
On 06/02/2016 06:21 PM, Matthew Knepley wrote:
On Thu, Jun 2, 2016 at 5:11 PM, Luc Berger-Vergiat <lb2...@columbia.edu> wrote:
> Hi all,
> I would like a quick clarification on what is and is_local represent in
> PCASMSetLocalSubdomains().
> My understanding is that if I have two MPI ranks and four subdomains, I can
> end up having four blocks that I can
Hi all,
I would like a quick clarification on what is and is_local represent in
PCASMSetLocalSubdomains().
My understanding is that if I have two MPI ranks and four subdomains, I
can end up having four blocks that I can denote as follows:
| domain1 | domain2 | domain
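For reference, a minimal C sketch of the setup being asked about, assuming two
of the four subdomains live on each of the two ranks. The helper name, its
parameters (the rank's ownership range rstart/rend and the global size N), the
stride-based index sets, and the one-row overlap are invented for illustration
and are not from the thread:

#include <petscksp.h>

/* Hypothetical helper: split this rank's rows into 2 subdomains, so that
   2 ranks give 4 subdomains in total.  "is" holds the overlapping
   subdomains, "is_local" the non-overlapping parts owned by this rank. */
PetscErrorCode SetupTwoSubdomainsPerRank(PC pc, PetscInt rstart, PetscInt rend, PetscInt N)
{
  IS             is[2], is_local[2];
  PetscInt       i, half = (rend - rstart) / 2;
  PetscErrorCode ierr;

  PetscFunctionBeginUser;
  for (i = 0; i < 2; i++) {
    PetscInt lo  = rstart + i * half;                 /* start of this subdomain */
    PetscInt hi  = (i == 0) ? rstart + half : rend;   /* end of this subdomain   */
    PetscInt olo = PetscMax(lo - 1, 0);               /* one row of overlap      */
    PetscInt ohi = PetscMin(hi + 1, N);
    /* overlapping subdomain (what "is" describes) */
    ierr = ISCreateStride(PETSC_COMM_SELF, ohi - olo, olo, 1, &is[i]);CHKERRQ(ierr);
    /* non-overlapping part owned by this rank (what "is_local" describes) */
    ierr = ISCreateStride(PETSC_COMM_SELF, hi - lo, lo, 1, &is_local[i]);CHKERRQ(ierr);
  }
  /* n = 2 subdomains on this rank: is and is_local are arrays of IS */
  ierr = PCASMSetLocalSubdomains(pc, 2, is, is_local);CHKERRQ(ierr);
  /* PCASM keeps its own reference to the index sets, so drop ours */
  for (i = 0; i < 2; i++) {
    ierr = ISDestroy(&is[i]);CHKERRQ(ierr);
    ierr = ISDestroy(&is_local[i]);CHKERRQ(ierr);
  }
  PetscFunctionReturn(0);
}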
Re,
Makes sense to read the documentation; I will try with another
preconditioner.
Thanks for the support.
2016-06-02 18:15 GMT+02:00 Matthew Knepley:
> On Thu, Jun 2, 2016 at 11:10 AM, neok m4700 wrote:
>
>> Hi Satish,
>>
>> Thanks for the correction.
>>
>> The error message is now slightly
On Thu, Jun 2, 2016 at 11:10 AM, neok m4700 wrote:
> Hi Satish,
>
> Thanks for the correction.
>
> The error message is now slightly different, but the result is the same
> (serial runs fine, parallel with mpirun fails with following error):
>
Now the error is correct. You are asking to run ICC
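The reply is truncated here; presumably it points out that PETSc's native ICC
(like ILU) is a sequential factorization and so cannot be applied directly to a
matrix distributed over several ranks. A sketch of the usual workaround, using
a made-up 1-D Laplacian rather than the matrix from the demo script: use block
Jacobi (or ASM) in parallel and apply ICC on each sequential diagonal block.

#include <petscksp.h>

int main(int argc, char **argv)
{
  Mat            A;
  Vec            x, b;
  KSP            ksp;
  PC             pc;
  PetscInt       i, rstart, rend, n = 100;
  PetscErrorCode ierr;

  ierr = PetscInitialize(&argc, &argv, NULL, NULL);if (ierr) return ierr;

  /* Assemble a simple SPD matrix (1-D Laplacian) in parallel */
  ierr = MatCreate(PETSC_COMM_WORLD, &A);CHKERRQ(ierr);
  ierr = MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, n, n);CHKERRQ(ierr);
  ierr = MatSetFromOptions(A);CHKERRQ(ierr);
  ierr = MatSetUp(A);CHKERRQ(ierr);
  ierr = MatGetOwnershipRange(A, &rstart, &rend);CHKERRQ(ierr);
  for (i = rstart; i < rend; i++) {
    if (i > 0)     {ierr = MatSetValue(A, i, i-1, -1.0, INSERT_VALUES);CHKERRQ(ierr);}
    if (i < n - 1) {ierr = MatSetValue(A, i, i+1, -1.0, INSERT_VALUES);CHKERRQ(ierr);}
    ierr = MatSetValue(A, i, i, 2.0, INSERT_VALUES);CHKERRQ(ierr);
  }
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  ierr = MatCreateVecs(A, &x, &b);CHKERRQ(ierr);
  ierr = VecSet(b, 1.0);CHKERRQ(ierr);

  ierr = KSPCreate(PETSC_COMM_WORLD, &ksp);CHKERRQ(ierr);
  ierr = KSPSetOperators(ksp, A, A);CHKERRQ(ierr);
  ierr = KSPSetType(ksp, KSPCG);CHKERRQ(ierr);
  ierr = KSPGetPC(ksp, &pc);CHKERRQ(ierr);
  /* Block Jacobi works in parallel; ICC is then applied on each
     sequential diagonal block via the -sub_pc_type option. */
  ierr = PCSetType(pc, PCBJACOBI);CHKERRQ(ierr);
  ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);   /* honors -sub_pc_type icc */
  ierr = KSPSolve(ksp, b, x);CHKERRQ(ierr);

  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&b);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = KSPDestroy(&ksp);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}

Run it with, e.g., mpiexec -np 2 ./ex -sub_pc_type icc; the same effect is
available purely from the command line with -pc_type bjacobi -sub_pc_type icc
(or -pc_type asm).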
Hi Satish,
Thanks for the correction.
The error message is now slightly different, but the result is the same
(serial runs fine, parallel with mpirun fails with following error):
[0] KSPSolve() line 599 in <...>/src/ksp/ksp/interface/itfunc.c
[0] KSPSetUp() line 390 in <...>/src/ksp/ksp/interfac
with petsc-master - you would have to use petsc4py-master.
i.e. try petsc-eab7b92 with petsc4py-6e8e093
Satish
On Thu, 2 Jun 2016, neok m4700 wrote:
> Hi Matthew,
>
> I've rebuilt petsc // petsc4py with the following versions:
>
> 3.7.0 // 3.7.0 => same runtime error
> 00c67f3 // 3.7.1 => fails t
Hi Matthew,
I've rebuilt petsc // petsc4py with the following versions:
3.7.0 // 3.7.0 => same runtime error
00c67f3 // 3.7.1 => fails to build petsc4py (error below)
00c67f3 // 6e8e093 => same as above
f1b0812 (latest commit) // 6e8e093 (latest commit) => same as above
In file included from src/PET
On Thu, Jun 2, 2016 at 9:12 AM, neok m4700 wrote:
> Hi,
>
> I built petsc 3.7.1 and petsc4py 3.7.0 (with openmpi 1.10.2) and ran the
> examples in the demo directory.
>
I believe this was fixed in 'master':
https://bitbucket.org/petsc/petsc/commits/00c67f3b09c0bcda06af5ed306d845d9138e5003
Is it
Hi,
I built petsc 3.7.1 and petsc4py 3.7.0 (with openmpi 1.10.2) and ran the
examples in the demo directory.
$ python test_mat_ksp.py
=> runs as expected (serial)
$ mpiexec -np 2 python test_mat_ksp.py
=> fails with the following output:
Traceback (most recent call last):
File "<...>/demo/ksp