>
> Can you please try v3.12.3? There was some funky business mistakenly
> added related to partitioning that has been fixed in 3.12.3.
>
> Barry
>
>
> > On Jan 10, 2020, at 1:57 PM, Santiago Andres Triana
> wrote:
> >
> > Dear all,
> >
> &
y for SuperLU_DIST.
>
> Suggest looking at the code, or running in the debugger to see what is
> going on there. We use parmetis all the time and don't see this.
>
> Barry
>
>
>
>
>
>
> On Jan 8, 2020, at 4:34 PM, Santiago Andres Triana
> wrote:
>
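For reference on Barry's debugger suggestion: PETSc programs can be dropped
into a debugger via runtime options, without rebuilding. A minimal sketch
(the program name ./myapp is a placeholder):

  ./myapp -start_in_debugger gdb
  ./myapp -on_error_attach_debugger

The first launches every rank under gdb at startup; the second attaches a
debugger only when PETSc raises an error, which is usually more convenient
for a crash like this.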
Gb (with 240 Gb RAM), but only up to 3 Gb with the latest petsc/slepc.
Any suggestions, comments or any other help are very much appreciated!
Cheers,
Santiago
On Mon, Dec 23, 2019 at 11:19 PM Matthew Knepley wrote:
> On Mon, Dec 23, 2019 at 3:14 PM Santiago Andres Triana
> wrote:
>
Dear all,
After upgrading to petsc 3.12.2 my solver program crashes consistently.
Before the upgrade I was using petsc 3.9.4 with no problems.
My application deals with a complex-valued, generalized eigenvalue problem.
The matrices involved are relatively large, typically 2 to 10 Gb in size,
whic
Hello petsc-users:
I found this error when configure tries to download fblaslapack:
*******************************************************************************
         UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details):
-------------------------------------------------------------------------------
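A frequent cause when --download-fblaslapack fails is a missing Fortran
compiler; configure.log states the actual reason. As a hedged alternative
(and the route that also supports quad precision, relevant to the threads
below), PETSc can build a Fortran-free BLAS/LAPACK instead:

  ./configure --download-f2cblaslapack [remaining options unchanged]

This is a sketch; which fix applies depends on what configure.log reports.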
output should be sent to the MUMPS developers.
>>
>> Hong,
>>
>> Can you send this to the MUMPS developers and see what they say?
>>
>> Thanks
>>
>> Barry
>>
>>
> > On Oct 30, 2018, at 2:04 PM, Santiago Andres Triana wrote:
.
Thanks!
Santiago
On Sun, Oct 28, 2018 at 10:31 AM Dave May wrote:
>
>
> On Sun, 28 Oct 2018 at 09:37, Santiago Andres Triana
> wrote:
>
>> Hi petsc-users,
>>
>> I am experiencing problems running ex5 and ex7 from the slepc tutorial.
>> This is after upgrade to petsc-3.10.2 and slepc-3.10.1.
Hi petsc-users,
I am experiencing problems running ex5 and ex7 from the slepc tutorial.
This is after upgrade to petsc-3.10.2 and slepc-3.10.1. Has anyone run into
this problem? See the error message below. Any help or advice would be
highly appreciated. Thanks in advance!
Santiago
trianas@hpc
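For anyone trying to reproduce this, a minimal sketch of how the tutorial
examples are typically built and run (directory layout follows the
slepc-3.10 source tree; the matrix files for ex7 are placeholders):

  cd $SLEPC_DIR/src/eps/examples/tutorials
  make ex5 ex7
  mpiexec -n 2 ./ex5 -eps_nev 4
  mpiexec -n 2 ./ex7 -f1 A.petsc -f2 B.petsc

ex7 expects the A and B matrices of the generalized problem in PETSc binary
format via -f1 and -f2.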
Dear Karl, Jed:
It was indeed the --with-fortran-kernels=1 option that was the culprit.
Without it, the make check step succeeds :)
Thanks so much for your prompt help!
Santiago
On Mon, Jul 30, 2018 at 6:58 PM, Karl Rupp wrote:
> Hi Santiago,
>
>
> I am trying to install petsc with the option --with
Dear petsc-users,
I am trying to install petsc with the option --with-precision=__float128.
The ./configure goes fine, as well as the make all stage. However, the make
check step to test the libraries fails with the following error:
/usr/bin/ld: home/spin/petsc-3.9.3/arch-linux2-c-opt/lib/libpets
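Putting this thread together with the resolution above (dropping
--with-fortran-kernels=1), a plausible configure sketch for a quad-precision
build looks like:

  ./configure --with-precision=__float128 --download-f2cblaslapack

f2cblaslapack is the BLAS/LAPACK download that supports __float128;
fblaslapack does not. Any further options (compilers,
--with-scalar-type=complex, etc.) are assumptions that depend on the
application.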
> The fact that B is singular should not be a problem, provided that you do
> shift-and-invert with a nonzero target value.
> Can you send the output of -eps_view so that I can get a better idea what
> you are doing?
>
> Jose
>
>
> > On Mar 5, 2018, at 0:50, Santiago Andres Triana wrote:
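Jose's suggestion corresponds to a handful of SLEPc calls. A minimal sketch
in C, assuming A and B are already assembled and picking an illustrative
target of 1.0 (the actual value should sit near the wanted eigenvalues):

#include <slepceps.h>

/* Shift-and-invert with a nonzero target for A x = lambda B x,
   where B may be singular. A, B are assumed assembled. */
PetscErrorCode solve_gevp(Mat A, Mat B)
{
  EPS            eps;
  ST             st;
  PetscErrorCode ierr;

  ierr = EPSCreate(PETSC_COMM_WORLD, &eps);CHKERRQ(ierr);
  ierr = EPSSetOperators(eps, A, B);CHKERRQ(ierr);
  ierr = EPSSetProblemType(eps, EPS_GNHEP);CHKERRQ(ierr);  /* generalized non-Hermitian */
  ierr = EPSGetST(eps, &st);CHKERRQ(ierr);
  ierr = STSetType(st, STSINVERT);CHKERRQ(ierr);           /* shift-and-invert */
  ierr = EPSSetTarget(eps, 1.0);CHKERRQ(ierr);             /* nonzero target, per Jose */
  ierr = EPSSetWhichEigenpairs(eps, EPS_TARGET_MAGNITUDE);CHKERRQ(ierr);
  ierr = EPSSetFromOptions(eps);CHKERRQ(ierr);             /* honors -eps_view etc. */
  ierr = EPSSolve(eps);CHKERRQ(ierr);
  ierr = EPSDestroy(&eps);CHKERRQ(ierr);
  return 0;
}

The same setup purely from the command line: -st_type sinvert
-eps_target 1.0 -eps_target_magnitude, plus -eps_view to generate the
output Jose asks for.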
Dear all,
A rather general question: is there any possibility of solving a
complex-valued generalized eigenvalue problem using quad (or extended)
precision when the 'B' matrix is singular? So far I have been using MUMPS
with double precision with good results, but I eventually require extended
precision.
Hi petsc-users,
What solvers (either petsc-native or external packages) are available for
quad precision (i.e. __float128) computations? I am dealing with a large
(1e6 x 1e6), sparse, complex-valued, non-Hermitian, and non-symmetric
generalized eigenvalue problem. So far I have been using mumps
(K
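One constraint worth flagging on this question: to my knowledge the external
direct solvers used elsewhere in these threads (MUMPS, SuperLU_DIST) only
support single/double precision, so a __float128 build is limited to PETSc's
native solvers. For the shift-and-invert factorization that means PETSc's
built-in sparse LU, e.g.:

  -st_ksp_type preonly -st_pc_type lu

This is a hedged note based on the packages' documented precision support.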
t such messages. Or use
> different compilers..
>
> What do you have for:
>
> mpicc -show
>
>
> Satish
>
> On Wed, 20 Dec 2017, Santiago Andres Triana wrote:
>
> > Dear petsc-users,
> >
> > I'm trying to install petsc on a cluster using SGI's MPT.
Dear petsc-users,
I'm trying to install petsc on a cluster using SGI's MPT. The mpicc
compiler is in the search path. The configure command is:
./configure --with-scalar-type=complex --with-mumps=1 --download-mumps
--download-parmetis --download-metis --download-scalapack
However, this leads to
no reason to run it
> with --with-batch.
>
> Make test fails because it cannot launch parallel jobs directly using
> the mpiexec it is using.
>
> You need to determine how to submit jobs on this system and then you
> are ready to go.
>
> Barry
>
>
> > O
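Barry's point in practice: the checks have to go through the site's
scheduler rather than a bare mpiexec. A hypothetical job script (the PBS
directives and the launcher name are placeholders; e.g. mpiexec_mpt on SGI
MPT systems, srun on SLURM):

  #!/bin/bash
  #PBS -l nodes=1:ppn=4
  cd $PBS_O_WORKDIR
  mpiexec_mpt -n 4 ./ex7 -f1 A.petsc -f2 B.petsc

Once one parallel example runs this way, the library itself is ready to use.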
Dear petsc-users,
I'm trying to install petsc on a cluster that uses a job manager. This is
the configure command I use:
./configure --known-mpi-shared-libraries=1 --with-scalar-type=complex
--with-mumps=1 --download-mumps --download-parmetis
--with-blaslapack-dir=/sw/sdev/intel/psxe2015u3/compo