Re: [petsc-users] PETSC matrix assembling super slow

2019-02-05 Thread Smith, Barry F. via petsc-users
Send configure.log and make.log to petsc-ma...@mcs.anl.gov Barry > On Feb 5, 2019, at 10:48 PM, Yaxiong Chen wrote: > > Since mumps and scalapack are already installed on my computer, I only ran > ./configure with --download-superlu_dist . > After everything is done, I received the

Re: [petsc-users] PETSC matrix assembling super slow

2019-02-05 Thread Yaxiong Chen via petsc-users
Since mumps and scalapack are already installed on my computer, I only ran ./configure with --download-superlu_dist. After everything is done, I received the error: dyld: lazy symbol binding failed: Symbol not found: _MatSolverTypeRegister_SuperLU_DIST Referenced from:

Re: [petsc-users] PETSC matrix assembling super slow

2019-02-05 Thread Smith, Barry F. via petsc-users
Run ./configure with --download-superlu_dist --download-mumps --download-scalapack and then you can use either -pc_factor_mat_solver_type superlu_dist or -pc_factor_mat_solver_type mumps. Good luck > On Feb 5, 2019, at 9:29 PM, Yaxiong Chen wrote: > > > Also, I found the solving time is
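For reference, the sequence Barry describes looks roughly like the sketch below; ./my_app stands in for the user's executable (a placeholder, not from the thread), and the direct solve is requested through the usual preonly/LU combination.

./configure --download-superlu_dist --download-mumps --download-scalapack
make all

# afterwards, pick a direct solver at run time, e.g.
./my_app -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type mumps
./my_app -ksp_type preonly -pc_type lu -pc_factor_mat_solver_type superlu_dist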

Re: [petsc-users] PETSC matrix assembling super slow

2019-02-05 Thread Yaxiong Chen via petsc-users
> Also, I found the solving time is also shorter when I use the direct > solver (0.432 s vs 4.332 s). Is this due to the small scale of the system? When > I have a very large (e.g., 10*10 ) system, can I expect the iterative > solver to be faster? << It sounds like the default

Re: [petsc-users] Preconditioning systems of equations with complex numbers

2019-02-05 Thread Abhyankar, Shrirang G via petsc-users
Hi Justin, Typically, power grid distribution systems have a radial structure (unless it is an urban area) that leads to a more or less staircase-type matrix. So a MatLoad() or VecLoad() would presumably just split the stairs, akin to a 1-D PDE. However, as you pointed out, it

Re: [petsc-users] reading petsc binary files.

2019-02-05 Thread Smith, Barry F. via petsc-users
> On Feb 5, 2019, at 5:06 PM, Sanjay Kharche via petsc-users > wrote: > > > Hi > > I use two MPI clusters (cluster 1 and 2). Whereas the PETSc binary files I > generate can be read on cluster 1, I get errors doing so on cluster 2. I also > output vts files corresponding to each binary

Re: [petsc-users] [TimeStepping] Eventhandler

2019-02-05 Thread Jed Brown via petsc-users
Dolfin will need this PR to work with any PETSc 3.10. https://bitbucket.org/fenics-project/dolfin/pull-requests/508/jed-petsc-310/diff It's been chillin' there for a couple months; it appears that much of the Dolfin development effort has moved to DolfinX and Firedrake. "Huck, Moritz" writes:

[petsc-users] reading petsc binary files.

2019-02-05 Thread Sanjay Kharche via petsc-users
Hi I use two MPI clusters (cluster 1 and 2). Whereas the PETSc binary files I generate can be read on cluster 1, I get errors doing so on cluster 2. I also output vts files corresponding to each binary file output, and it appears that both clusters do produce meaningful results. I use ver
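For context, writing and re-reading a Vec in PETSc's binary format usually boils down to something like the sketch below (the file name state.bin and the vector size are placeholders). The binary format is machine independent (data are stored in a fixed byte order), so the same file should normally be readable on both clusters.

#include <petscvec.h>

int main(int argc, char **argv)
{
  Vec         v;
  PetscViewer viewer;

  PetscInitialize(&argc, &argv, NULL, NULL);

  /* write a vector to a PETSc binary file */
  VecCreateMPI(PETSC_COMM_WORLD, PETSC_DECIDE, 100, &v);
  VecSet(v, 1.0);
  PetscViewerBinaryOpen(PETSC_COMM_WORLD, "state.bin", FILE_MODE_WRITE, &viewer);
  VecView(v, viewer);
  PetscViewerDestroy(&viewer);

  /* read it back, possibly on a different machine or process count */
  PetscViewerBinaryOpen(PETSC_COMM_WORLD, "state.bin", FILE_MODE_READ, &viewer);
  VecLoad(v, viewer);
  PetscViewerDestroy(&viewer);

  VecDestroy(&v);
  PetscFinalize();
  return 0;
}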

Re: [petsc-users] PETSC matrix assembling super slow

2019-02-05 Thread Mark Adams via petsc-users
On Mon, Feb 4, 2019 at 4:17 PM Yaxiong Chen wrote: > Hi Mark, > > Will the parameter MatMPIAIJSetPreallocation influence the > following part? > do i=mystart,nelem,nproc > call ptSystem%getElementalMAT(i, Ae, auxRHSe, idx) > ne=size(idx) >
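For reference, the pattern being discussed looks roughly like the C sketch below (the user's code is Fortran; GetElement, the 1D toy element, and the per-row estimates are made up for illustration). The key point is that preallocation happens once, before the element loop, and should be an upper bound on the nonzeros each row will receive.

#include <petscmat.h>

/* toy stand-in for the Fortran ptSystem%getElementalMAT call: a 2-node
   1D element matrix; name and contents are invented for this sketch */
static void GetElement(PetscInt e, PetscInt *ne, PetscInt idx[], PetscScalar Ae[])
{
  *ne = 2; idx[0] = e; idx[1] = e + 1;
  Ae[0] = 1.0; Ae[1] = -1.0; Ae[2] = -1.0; Ae[3] = 1.0;
}

int main(int argc, char **argv)
{
  Mat         A;
  PetscMPIInt rank, size;
  PetscInt    nelem = 1000, N;

  PetscInitialize(&argc, &argv, NULL, NULL);
  MPI_Comm_rank(PETSC_COMM_WORLD, &rank);
  MPI_Comm_size(PETSC_COMM_WORLD, &size);
  N = nelem + 1;

  MatCreate(PETSC_COMM_WORLD, &A);
  MatSetSizes(A, PETSC_DECIDE, PETSC_DECIDE, N, N);
  MatSetType(A, MATAIJ);
  /* preallocate BEFORE inserting; 3 and 2 nonzeros per row are upper
     bounds for this 1D toy. Underestimating forces mallocs inside
     MatSetValues, which is the usual cause of very slow assembly. */
  MatMPIAIJSetPreallocation(A, 3, NULL, 2, NULL);
  MatSeqAIJSetPreallocation(A, 3, NULL);

  /* mirrors the Fortran "do i = mystart, nelem, nproc" loop */
  for (PetscInt e = rank; e < nelem; e += size) {
    PetscInt    ne, idx[2];
    PetscScalar Ae[4];
    GetElement(e, &ne, idx, Ae);
    MatSetValues(A, ne, idx, ne, idx, Ae, ADD_VALUES);
  }
  MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);
  MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);

  MatDestroy(&A);
  PetscFinalize();
  return 0;
}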

Re: [petsc-users] MWE for DMPlexCreateCGNS

2019-02-05 Thread Matthew Knepley via petsc-users
On Tue, Feb 5, 2019 at 11:13 AM Andrew Parker wrote: > On Tue, 5 Feb 2019 at 15:27, Matthew Knepley wrote: > >> On Tue, Feb 5, 2019 at 9:47 AM Andrew Parker via petsc-users < >> petsc-users@mcs.anl.gov> wrote: >> >>> Does anyone have a MWE for DMPlexCreateCGNS to use in parallel? Ideally, >>>
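A minimal sketch of what such an MWE could look like, assuming a PETSc build configured with CGNS support and using DMPlexCreateCGNSFromFile (the file-name variant of DMPlexCreateCGNS); the file name mesh.cgns and the -filename option are placeholders, not from the thread. The mesh is read and then distributed with DMPlexDistribute.

#include <petscdmplex.h>

int main(int argc, char **argv)
{
  DM        dm, dmDist;
  char      filename[PETSC_MAX_PATH_LEN] = "mesh.cgns"; /* placeholder */
  PetscBool flg;

  PetscInitialize(&argc, &argv, NULL, NULL);
  PetscOptionsGetString(NULL, NULL, "-filename", filename, sizeof(filename), &flg);

  /* read the CGNS mesh (interpolate = PETSC_TRUE builds faces/edges) */
  DMPlexCreateCGNSFromFile(PETSC_COMM_WORLD, filename, PETSC_TRUE, &dm);

  /* distribute across the processes; overlap 0, no migration SF kept */
  DMPlexDistribute(dm, 0, NULL, &dmDist);
  if (dmDist) { DMDestroy(&dm); dm = dmDist; }

  DMView(dm, PETSC_VIEWER_STDOUT_WORLD);
  DMDestroy(&dm);
  PetscFinalize();
  return 0;
}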

Re: [petsc-users] PETSC matrix assembling super slow

2019-02-05 Thread Smith, Barry F. via petsc-users
> On Feb 5, 2019, at 10:32 AM, Yaxiong Chen wrote: > > Thanks Barry, > I will explore how to partition for parallel computation later. But now > I still have some confusion about the sequential operation. > I compared PETSc and MUMPS. In both cases, the subroutine for generating > elemental

Re: [petsc-users] [TimeStepping] Eventhandler

2019-02-05 Thread Huck, Moritz via petsc-users
I am using the FEniCS library for my PDE discretization. I cannot compile FEniCS with PETSc 3.10.3. I will locate the exact error tomorrow. From: Jed Brown Sent: Tuesday, 5 February 2019 15:30:38 To: Huck, Moritz; Smith, Barry F.; Abhyankar, Shrirang

Re: [petsc-users] [TimeStepping] Eventhandler

2019-02-05 Thread Abhyankar, Shrirang G via petsc-users
Ok, I'll add two things: (i) a TSEventSetPostEventTimeStep(), and (ii) a flag that allows the user to select either the previous time-step or the original time-step. For most users, the flag should suffice. Users who know how the dynamics would behave after the event can set the

Re: [petsc-users] Preconditioning systems of equations with complex numbers

2019-02-05 Thread Mark Adams via petsc-users
I would stay away from eigen estimates in the solver (but give us the spectra to look at), so set -pc_gamg_agg_nsmooths 0 and use sor. Applications that have lived on direct solvers can add all sorts of crap like penalty terms. sor seemed to work OK, so I'd check the coarse grids in GAMG. Test with
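In option form, one way to try that suggestion could be the line below; ./my_app is a placeholder, and pairing SOR with GAMG through the mg_levels_ prefix is just one common choice, not necessarily what Mark had in mind.

./my_app -pc_type gamg -pc_gamg_agg_nsmooths 0 -mg_levels_pc_type sor -ksp_monitor_true_residual -ksp_view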

Re: [petsc-users] Store type (Eigen::Vector2d) in a petsc vec

2019-02-05 Thread Jed Brown via petsc-users
Andrew Parker writes: > Thanks, so you would suggest a flat vector storing u, v, w (or indeed x, y, > z) or interleaved and then construct eigen types on the fly? Interleaved if you want to use Eigen types in the same memory, or if your code (like most applications) benefits more from memory

Re: [petsc-users] Store type (Eigen::Vector2d) in a petsc vec

2019-02-05 Thread Andrew Parker via petsc-users
Thanks, so you would suggest a flat vector storing u, v, w (or indeed x, y, z) or interleaved and then construct Eigen types on the fly? Can I ask, is that because Vec cannot store user-defined types (as in, it's not templatable)? Thanks, Andy On Tue, 5 Feb 2019 at 14:22, Jed Brown wrote: >

Re: [petsc-users] Installing PETSc

2019-02-05 Thread Jed Brown via petsc-users
Fazlul Huq via petsc-users writes: > Hello PETSc Developers, > > Maybe this is a trivial question! > > I usually run PETSc code from the Home/petsc-3.10.2 directory. The other day I tried > to run the code from the Documents/petsc directory but I can't. As far as I can > recall, I have installed PETSc in

Re: [petsc-users] Installing PETSc

2019-02-05 Thread Balay, Satish via petsc-users
On Tue, 5 Feb 2019, Fazlul Huq via petsc-users wrote: > Hello PETSc Developers, > > Maybe this is a trivial question! > > I usually run PETSc code from the Home/petsc-3.10.2 directory. The other day I tried > to run the code from the Documents/petsc directory but I can't. I don't know what you mean here -

Re: [petsc-users] [TimeStepping] Eventhandler

2019-02-05 Thread Jed Brown via petsc-users
"Huck, Moritz via petsc-users" writes: > @Shri > The system is very stiff, but the stiffness is handled well by ARKIMEX. > > I'am using PETSc 3.10. (I cannot use 3.10.3 at the moment due to > compatibilty with a third library), What compatibility problem is this? 3.10.3 should be (binary and

Re: [petsc-users] Store type (Eigen::Vector2d) in a petsc vec

2019-02-05 Thread Jed Brown via petsc-users
My suggestion is to use PETSc like usual and inside your residual/Jacobian evaluation, for each cell or batch of cells, create Eigen objects. For size 2d or 3d, it won't matter much whether you make them share memory with the PETSc Vec -- the Eigen types should mostly exist in registers. Andrew
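A minimal C++ sketch of that suggestion, assuming a real, double-precision PETSc build and an interleaved 3-component layout (both assumptions for illustration; the function name and the "physics" are placeholders):

#include <petscvec.h>
#include <Eigen/Dense>

/* evaluate a residual by viewing 3 consecutive Vec entries at a time as
   an Eigen vector; no data is copied */
static PetscErrorCode ResidualSketch(Vec X, Vec F)
{
  const PetscScalar *x;
  PetscScalar       *f;
  PetscInt           n;

  VecGetLocalSize(X, &n);
  VecGetArrayRead(X, &x);
  VecGetArray(F, &f);
  for (PetscInt i = 0; i < n; i += 3) {
    Eigen::Map<const Eigen::Vector3d> u(&x[i]);
    Eigen::Map<Eigen::Vector3d>       r(&f[i]);
    r = 2.0 * u;   /* placeholder for the real per-point computation */
  }
  VecRestoreArray(F, &f);
  VecRestoreArrayRead(X, &x);
  return 0;
}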

Re: [petsc-users] [TimeStepping] Eventhandler

2019-02-05 Thread Huck, Moritz via petsc-users
@Shri The system is very stiff, but the stiffness is handled well by ARKIMEX. I'm using PETSc 3.10 (I cannot use 3.10.3 at the moment due to compatibility with a third-party library), and no special options for TS. Writing an MWE might take some time; if it is necessary I will write something at the

Re: [petsc-users] Problem in Loading Matrix in Examples

2019-02-05 Thread Eda Oktay via petsc-users
Dear Matt, Thank you for answering. I didn't recognize that part. But I still can't find a matrix called "small", and there are some examples testing with "medium". Am I missing something? Thanks, Eda Matthew Knepley wrote on Tue, 5 Feb 2019 at 13:35: > On Tue, Feb 5, 2019 at 4:44

[petsc-users] Problem in Loading Matrix in Examples

2019-02-05 Thread Eda Oktay via petsc-users
Hi, I am new to PETSc and trying to run some examples in MAT file. I cannot run the ones that need to load a matrix from a file. Here are the examples: For ex12 in MAT file: args: -f0 ${wPETSC_DIR}/share/petsc/datafiles/matrices/ns-real-int32-float64 requires: double !complex
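For reference, the matrix-loading part of such examples boils down to something like the sketch below, which follows the same -f0 convention; the executable name is a placeholder.

#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat         A;
  PetscViewer viewer;
  char        file[PETSC_MAX_PATH_LEN];
  PetscBool   flg;

  PetscInitialize(&argc, &argv, NULL, NULL);
  /* same convention as the examples: the binary matrix file is given with -f0 */
  PetscOptionsGetString(NULL, NULL, "-f0", file, sizeof(file), &flg);
  if (!flg) SETERRQ(PETSC_COMM_WORLD, PETSC_ERR_USER, "Provide a binary matrix file with -f0");

  PetscViewerBinaryOpen(PETSC_COMM_WORLD, file, FILE_MODE_READ, &viewer);
  MatCreate(PETSC_COMM_WORLD, &A);
  MatLoad(A, viewer);
  PetscViewerDestroy(&viewer);

  MatDestroy(&A);
  PetscFinalize();
  return 0;
}

Run, for example, as ./loadmat -f0 $PETSC_DIR/share/petsc/datafiles/matrices/ns-real-int32-float64, with a real, double-precision build (matching the "requires: double !complex" line quoted above).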

Re: [petsc-users] Doubt on how to copy a Mat into another (Fortran)

2019-02-05 Thread Marco Tiberga via petsc-users
Dear Matt, Thanks for your reply; now it’s clear to me also why matArray is not destroyed, contrary to Amat. Do you think that explicitly declaring matArray as a (Fortran) pointer (at line 113) and then using “matArray(1) => Amat” would be equivalent? It would certainly be clearer from the