Hi Frank,
On 11 July 2016 at 19:14, frank wrote:
On 11 Jul 2016, at 21:06, Jose E. Roman wrote:
The problem has to do with the assumptions in the Python scripts. See
I don't understand why I don't get this warning.
Still, I don't see where the problem is. Please tell me exactly what you want me
to change, or better, make a pull request.
Thanks.
Jose
On 11 Jul 2016, at 17:06, Denis Davydov wrote:
On Mon, Jul 11, 2016 at 1:22 PM, Ketan Maheshwari <
ketancmaheshw...@gmail.com> wrote:
Matthew,
I am probably not using the right language, but I meant that each element
has three indices associated with it: x, y, z.
Here is a snapshot:
1 10 55 5.7113635929515209e-03
1 10 56 4.2977490038287334e-03
1 10 57 2.8719519782193204e-03
1 10 58 1.4380140927001712e-03
1 10 59
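Reading the snapshot as three integer indices followed by a value, a minimal parsing sketch looks like the following. The column meaning (x, y, z, then the matrix entry) is an assumption based on the description above, and `parse_rows` is a hypothetical helper, not anything from PETSc:

```python
# Sketch: parse "x y z value" rows (like the snapshot above) into a
# dict keyed by the (x, y, z) index triple.  Truncated or malformed
# rows are skipped rather than guessed at.
def parse_rows(lines):
    data = {}
    for line in lines:
        parts = line.split()
        if len(parts) != 4:
            continue  # e.g. a row cut off by the snippet
        x, y, z = (int(p) for p in parts[:3])
        data[(x, y, z)] = float(parts[3])
    return data

rows = [
    "1 10 55 5.7113635929515209e-03",
    "1 10 56 4.2977490038287334e-03",
    "1 10 59",  # truncated row from the snapshot, skipped
]
table = parse_rows(rows)
```

From a dict like this one can fill a dense array once the index ranges are known.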
Hi Dave,
I re-ran the test using bjacobi as the preconditioner on the coarse mesh
of telescope. The grid is 3072*256*768 and the process mesh is 96*8*24. The
PETSc options file is attached.
I still got the "Out Of Memory" error. The error occurred before the
linear solver finished one step. So I do
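Since the attached options file is not shown here, the following is only a hypothetical sketch of what a telescope-with-bjacobi configuration can look like; the reduction factor and the `telescope_` inner-solver prefix are assumptions, not a reconstruction of the attachment:

```
# Hypothetical PETSc options sketch (NOT the attached file):
# repartition the coarse problem onto fewer ranks with PCTelescope,
# and precondition the repartitioned solve with block Jacobi.
-ksp_type cg
-pc_type telescope
-pc_telescope_reduction_factor 64
-telescope_pc_type bjacobi
-ksp_view
```

`-ksp_view` prints the assembled solver hierarchy, which is usually the first thing to check when an option is silently ignored.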
On Mon, Jul 11, 2016 at 12:05 PM, Ketan Maheshwari <
ketancmaheshw...@gmail.com> wrote:
Hello PETSC-ers,
I am a research faculty member at the Univ of Pittsburgh trying to use PETSc/SLEPc to
obtain the diagonalization of a large matrix using the Lanczos or Davidson
method.
The matrix is a three-dimensional dense matrix with a total of 216000 elements.
After looking into some of the examples in PETSc
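For intuition about what a Lanczos-type eigensolver does with such a matrix, here is a plain-NumPy sketch of the Lanczos recurrence. This is illustrative only: SLEPc's EPS solvers implement this far more robustly (restarts, parallel Mat/Vec types), and nothing below is the SLEPc API. The planted-spectrum demo matrix is made up for the example:

```python
import numpy as np

def lanczos_ritz_values(A, k, seed=0):
    """Plain Lanczos with full reorthogonalization: build a k-step
    Krylov basis for symmetric A and return the Ritz values, i.e. the
    eigenvalues of the small tridiagonal projection T.  Only a sketch
    of what SLEPc's Lanczos/Krylov-Schur solvers do at scale."""
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    q = rng.standard_normal(n)
    q /= np.linalg.norm(q)
    Q = [q]
    alphas, betas = [], []
    for j in range(k):
        w = A @ Q[j]
        alphas.append(Q[j] @ w)          # diagonal entry of T
        for qi in Q:                      # full reorthogonalization
            w = w - (qi @ w) * qi
        beta = np.linalg.norm(w)
        if beta < 1e-12:                  # invariant subspace found
            break
        betas.append(beta)                # off-diagonal entry of T
        Q.append(w / beta)
    m = len(alphas)
    T = (np.diag(alphas)
         + np.diag(betas[: m - 1], 1)
         + np.diag(betas[: m - 1], -1))
    return np.linalg.eigvalsh(T)

# Demo: a 120x120 symmetric matrix with one well-separated eigenvalue
# planted at 3.0; extreme eigenvalues converge first under Lanczos.
rng = np.random.default_rng(1)
Qo, _ = np.linalg.qr(rng.standard_normal((120, 120)))
spectrum = np.linspace(0.0, 1.0, 120)
spectrum[-1] = 3.0
A = (Qo * spectrum) @ Qo.T               # Q diag(spectrum) Q^T
ritz = lanczos_ritz_values(A, k=40)      # ritz.max() should be ~3.0
```

The point of the sketch: only a small k-by-k tridiagonal problem is ever solved directly, which is why Lanczos-type methods suit matrices too large for dense diagonalization.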
Here is the warning:
Your SLEPC_DIR may not match the directory you are in
SLEPC_DIR: /Users/davydden/spack/var/spack/stage/slepc-3.7.1-p7hqqclwqvbvra6j44lka3xuc4eycvdg/slepc-3.7.1
Current directory: /private/var/folders/5k/sqpp24tx3ylds4fgm13pfht0gn/T/davydden/spack-stage/spack-stage-m7Xg8I
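The two paths above can name the same directory when symlinks are involved (on OS X, /var is a symlink into /private/var). A sketch of why a naive string comparison fires while a symlink-resolving one does not; the directory names and the check itself are illustrative, not what SLEPc's configure actually runs:

```python
import os
import tempfile

# Build a directory and a symlink to it (names are made up).
root = tempfile.mkdtemp()
real_dir = os.path.join(root, "slepc-3.7.1")
os.mkdir(real_dir)
link = os.path.join(root, "slepc-link")
os.symlink(real_dir, link)

slepc_dir = link                       # what the user exported
physical = os.path.realpath(link)      # what the filesystem reports

# Naive string comparison flags a "mismatch" even though both names
# refer to the same directory; comparing resolved paths does not.
naive_mismatch = (slepc_dir != physical)
robust_mismatch = (os.path.realpath(slepc_dir) != physical)
```

Resolving both sides with `os.path.realpath` (or `pwd -P` in a shell) is the usual way to make such a check symlink-safe.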
On Mon, Jul 11, 2016 at 3:13 AM, Marco Zocca wrote:
I cannot reproduce this behaviour. If I do, for instance, the following (on OS X El
Capitan):
$ cd ~/tmp
$ ln -s $SLEPC_DIR .
$ cd slepc-3.7.1
$ ./configure
$ make
$ otool -lv $PETSC_ARCH/lib/libslepc.dylib | grep slepc
I don't get a warning, and the output of otool is the same as would result if
done o
Sorry for the previous mail; I hadn't fully read ./configure --help:
all external package options are listed there, including HDF5.
As far as I can see, in
https://www.mcs.anl.gov/petsc/miscellaneous/external.html and in the
PDF manual not all external packages are mentioned, and this tripped
me
Good morning,
Does the HDF5 functionality need to be explicitly requested at
configure time? I just noticed that my default configuration on a
single-node machine does not compile in any of the relevant symbols.
I do not have HDF5 installed on my system yet, but I assumed PETSc
includes it by default, or
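For what it's worth, HDF5 is an optional external package in PETSc and does have to be requested at configure time. A typical invocation looks like the following sketch; the flags are the standard external-package options, but the install path is hypothetical:

```
# Let PETSc download and build HDF5 itself:
./configure --download-hdf5

# ...or point at an existing installation (path is hypothetical):
./configure --with-hdf5-dir=/opt/hdf5
```

After reconfiguring, a rebuild of PETSc is needed before the HDF5 viewer symbols appear in the library.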