Re: [petsc-users] killed 9 signal after upgrade from petsc 3.9.4 to 3.12.2

2020-01-10 Thread Santiago Andres Triana
Can you please try v3.12.3? There was some funky business mistakenly added related to partitioning that has been fixed in 3.12.3. Barry. On Jan 10, 2020, at 1:57 PM, Santiago Andres Triana wrote: Dear all, …

Re: [petsc-users] killed 9 signal after upgrade from petsc 3.9.4 to 3.12.2

2020-01-09 Thread Santiago Andres Triana
…y for SuperLU_DIST. Suggest looking at the code, or running in the debugger to see what is going on there. We use parmetis all the time and don't see this. Barry. On Jan 8, 2020, at 4:34 PM, Santiago Andres Triana wrote: …

Re: [petsc-users] killed 9 signal after upgrade from petsc 3.9.4 to 3.12.2

2020-01-08 Thread Santiago Andres Triana
…Gb (with 240 Gb RAM), but only up to 3 Gb with the latest petsc/slepc. Any suggestions, comments or any other help are very much appreciated! Cheers, Santiago. On Mon, Dec 23, 2019 at 11:19 PM Matthew Knepley wrote: On Mon, Dec 23, 2019 at 3:14 PM Santiago Andres Triana wrote: …

[petsc-users] killed 9 signal after upgrade from petsc 3.9.4 to 3.12.2

2019-12-23 Thread Santiago Andres Triana
Dear all, after upgrading to petsc 3.12.2 my solver program crashes consistently. Before the upgrade I was using petsc 3.9.4 with no problems. My application deals with a complex-valued, generalized eigenvalue problem. The matrices involved are relatively large, typically 2 to 10 Gb in size, which …
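For a memory-related kill like the one described above, PETSc's built-in memory and performance logging can help pin down where the growth appears between versions. A minimal sketch; the solver binary name and its options are placeholders, not taken from the thread:

```shell
# Hypothetical solver invocation. -memory_view prints a summary of PETSc's
# memory usage at PetscFinalize(), and -log_view adds per-stage event logging,
# so runs under 3.9.4 and 3.12.2 can be compared side by side.
mpiexec -n 4 ./my_solver -eps_nev 4 -memory_view -log_view
```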

[petsc-users] problem downloading "fix-syntax-for-nag.tar.gx"

2019-11-19 Thread Santiago Andres Triana via petsc-users
Hello petsc-users: I found this error when configure tries to download fblaslapack: *** UNABLE to CONFIGURE with GIVEN OPTIONS (see configure.log for details): …
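When configure cannot fetch a package itself (for example behind a firewall, or when a download URL is broken), PETSc's configure accepts a pre-downloaded tarball path in place of a URL. A sketch under that assumption; the local path is a placeholder:

```shell
# Download the package tarball by hand (the URL configure attempted is shown
# in configure.log), then point the --download-<pkg> option at the local file
# instead of letting configure download it.
./configure --with-scalar-type=complex \
            --download-fblaslapack=/path/to/fblaslapack.tar.gz
```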

Re: [petsc-users] Segmentation violation

2018-10-31 Thread Santiago Andres Triana via petsc-users
…tput should be sent to the MUMPS developers. Hong, can you send this to the MUMPS developers and see what they say? Thanks, Barry. On Oct 30, 2018, at 2:04 PM, Santiago Andres Tr…

Re: [petsc-users] [SLEPc] ex5 fails, error in lapack

2018-10-28 Thread Santiago Andres Triana
… Thanks! Santiago. On Sun, Oct 28, 2018 at 10:31 AM Dave May wrote: On Sun, 28 Oct 2018 at 09:37, Santiago Andres Triana wrote: Hi petsc-users, I am experiencing problems running ex5 and ex7 from the slepc tutorial. This is after upgrad…

[petsc-users] [SLEPc] ex5 fails, error in lapack

2018-10-28 Thread Santiago Andres Triana
Hi petsc-users, I am experiencing problems running ex5 and ex7 from the slepc tutorials. This is after upgrading to petsc-3.10.2 and slepc-3.10.1. Has anyone run into this problem? See the error message below. Any help or advice would be highly appreciated. Thanks in advance! Santiago. trianas@hpc…

Re: [petsc-users] problem with installation using quad precision

2018-07-30 Thread Santiago Andres Triana
Dear Karl, Jed: The --with-fortran-kernels=1 option was indeed the culprit. Without it the make check step succeeds :) Thanks so much for your prompt help! Santiago. On Mon, Jul 30, 2018 at 6:58 PM, Karl Rupp wrote: Hi Santiago, I am trying to install petsc with the option --with…
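Given the resolution above, a quad-precision build simply leaves the Fortran kernels at their default. A hedged sketch of such a configure line; the exact option set beyond the precision flag is an assumption, not copied from the thread:

```shell
# __float128 builds need a BLAS/LAPACK that supports quad precision;
# f2cblaslapack does, while fblaslapack does not. Note the absence of
# --with-fortran-kernels=1, which triggered the 'make check' failure above.
./configure --with-precision=__float128 --with-scalar-type=complex \
            --download-f2cblaslapack
```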

[petsc-users] problem with installation using quad precision

2018-07-30 Thread Santiago Andres Triana
Dear petsc-users, I am trying to install petsc with the option --with-precision=__float128. The ./configure goes fine, as well as the make all stage. However, the make check step to test the libraries fails with the following error: /usr/bin/ld: home/spin/petsc-3.9.3/arch-linux2-c-opt/lib/libpets…

Re: [petsc-users] Generalized eigenvalue problem using quad precision

2018-03-05 Thread Santiago Andres Triana
…he fact that B is singular should not be a problem, provided that you do shift-and-invert with a nonzero target value. Can you send the output of -eps_view so that I can get a better idea what you are doing? Jose. On 5 Mar 2018, at 0:50, Santiago An…
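Jose's suggestion of shift-and-invert with a nonzero target maps onto SLEPc command-line options roughly as follows. The solver binary name and the target value are purely illustrative; a useful target depends on where the wanted eigenvalues lie:

```shell
# Shift-and-invert around a nonzero target so the factorization involves
# (A - sigma*B) rather than the singular B alone; -eps_view prints the
# solver configuration Jose asked for.
./my_eps_solver -st_type sinvert -eps_target 0.5 -eps_view
```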

[petsc-users] Generalized eigenvalue problem using quad precision

2018-03-04 Thread Santiago Andres Triana
Dear all, a rather general question: is there any possibility of solving a complex-valued generalized eigenvalue problem using quad (or extended) precision when the 'B' matrix is singular? So far I have been using MUMPS with double precision with good results, but I will eventually require extended prec…

[petsc-users] quad precision solvers

2017-12-31 Thread Santiago Andres Triana
Hi petsc-users, what solvers (either petsc-native or external packages) are available for quad precision (i.e. __float128) computations? I am dealing with a large (1e6 x 1e6), sparse, complex-valued, non-Hermitian, and non-symmetric generalized eigenvalue problem. So far I have been using mumps (K…

Re: [petsc-users] configure cannot find a c preprocessor

2017-12-20 Thread Santiago Andres Triana
…t such messages. Or use different compilers. What do you have for: mpicc -show? Satish. On Wed, 20 Dec 2017, Santiago Andres Triana wrote: Dear petsc-users, I'm trying to install petsc in a cluster using SGI'…

[petsc-users] configure cannot find a c preprocessor

2017-12-20 Thread Santiago Andres Triana
Dear petsc-users, I'm trying to install petsc on a cluster using SGI's MPT. The mpicc compiler wrapper is in the search path. The configure command is: ./configure --with-scalar-type=complex --with-mumps=1 --download-mumps --download-parmetis --download-metis --download-scalapack However, this leads to …
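A common first step for this kind of configure failure, and the one Satish asks for in the reply above, is to inspect what the MPI compiler wrapper actually invokes, then hand the wrappers to configure explicitly. A sketch; the C++ and Fortran wrapper names are assumptions about the MPT installation:

```shell
# Print the underlying compiler and flags the MPT wrapper expands to,
# which often reveals a missing or broken preprocessor setup.
mpicc -show

# Then pass the wrappers to configure explicitly rather than relying on
# autodetection from the search path.
./configure --with-cc=mpicc --with-cxx=mpicxx --with-fc=mpif90 \
            --with-scalar-type=complex --with-mumps=1 --download-mumps \
            --download-parmetis --download-metis --download-scalapack
```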

Re: [petsc-users] configure fails with batch+scalapack

2017-12-19 Thread Santiago Andres Triana
…o reason to run it with --with-batch. Make test fails because it cannot launch parallel jobs directly using the mpiexec it is using. You need to determine how to submit jobs on this system and then you are ready to go. Barry. O…

[petsc-users] configure fails with batch+scalapack

2017-12-17 Thread Santiago Andres Triana
Dear petsc-users, I'm trying to install petsc on a cluster that uses a job manager. This is the configure command I use: ./configure --known-mpi-shared-libraries=1 --with-scalar-type=complex --with-mumps=1 --download-mumps --download-parmetis --with-blaslapack-dir=/sw/sdev/intel/psxe2015u3/compo…
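Following Barry's advice in the reply above, the usual pattern on such clusters is to configure and build without --with-batch and run the checks from inside a job allocation, where parallel launches are permitted. A hypothetical sketch; the scheduler (Slurm here) and the allocation command are assumptions about this particular system:

```shell
# Build on the login node as usual, without --with-batch.
./configure --with-scalar-type=complex --with-mumps=1 --download-mumps \
            --download-parmetis --download-scalapack
make all

# Run the checks from inside an allocation so mpiexec can launch
# parallel jobs (site-specific; an sbatch script works equally well).
salloc -n 2
make check
```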